sparkpipelineframework.testing

- **Name:** sparkpipelineframework.testing
- **Version:** 2.1.2 (PyPI)
- **Summary:** Testing Framework for SparkPipelineFramework
- **Home page:** https://github.com/imranq2/SparkPipelineFramework.Testing
- **Author:** Imran Qureshi
- **Requires Python:** >=3.6
- **Upload time:** 2024-01-03 07:50:51
            [![Build and Test](https://github.com/imranq2/SparkPipelineFramework.Testing/actions/workflows/build_and_Test.yml/badge.svg)](https://github.com/imranq2/SparkPipelineFramework.Testing/actions/workflows/build_and_Test.yml)

[![Upload Python Package](https://github.com/imranq2/SparkPipelineFramework.Testing/actions/workflows/python-publish.yml/badge.svg)](https://github.com/imranq2/SparkPipelineFramework.Testing/actions/workflows/python-publish.yml)

[![Known Vulnerabilities](https://snyk.io/test/github/imranq2/SparkPipelineFramework.Testing/badge.svg?targetFile=requirements.txt)](https://snyk.io/test/github/imranq2/SparkPipelineFramework.Testing?targetFile=requirements.txt)

# SparkPipelineFramework.Tests
Testing framework for the SparkPipelineFramework library: you test a transformer simply by providing input files to set up the input views before it runs, and output files to verify the views it produces.

## Usage
1. Create a folder structure that mirrors the folder structure of your library in SparkPipelineFramework (this is how the Testing Framework locates the Transformer to run)
2. Create an `input` folder and add files that represent the input views.  These files can be CSV, JSON, or Parquet
3. (Optional) Create an `input_schema` folder and add any schemas you want applied to the above views.  These follow the Spark JSON schema format.
4. (Optional) Create an `output` folder and add files that represent the output views you expect.  These files can be CSV, JSON, or Parquet
5. (Optional) Create an `output_schema` folder and add any schemas you want applied to the output views
6. Copy the following test code into a test file in this folder

```python
from pathlib import Path

from pyspark.sql import SparkSession

from spark_pipeline_framework_testing.test_runner import SparkPipelineFrameworkTestRunner


def test_folder(spark_session: SparkSession) -> None:
    # directory containing this test file; the runner scans it (and sub-folders)
    # for input, input_schema, output, and output_schema folders
    data_dir: Path = Path(__file__).parent.joinpath('./')

    SparkPipelineFrameworkTestRunner.run_tests(spark_session=spark_session, folder_path=data_dir)
```
7. Now just run this test (the `spark_session` argument is expected to be supplied by a pytest fixture).

Note: the test finds files in sub-folders too.
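The folder layout described in steps 1–5 can be sketched with the standard library alone. This is an illustrative sketch, not part of the framework: the view name `my_view` and its columns are hypothetical, and the schema file follows the Spark JSON schema format (the shape produced by `StructType.json()` in PySpark).

```python
# Sketch of the folder layout the test runner expects, built with the
# standard library only. View/column names here are hypothetical; a real
# test folder would mirror your transformer's views.
import csv
import json
import tempfile
from pathlib import Path

test_dir = Path(tempfile.mkdtemp())

# input/: one file per input view (CSV, JSON, or Parquet); the file stem
# becomes the view name.
input_dir = test_dir / "input"
input_dir.mkdir()
with open(input_dir / "my_view.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])
    writer.writerow(["1", "Imran"])

# input_schema/: optional Spark-format JSON schema applied to the view above.
schema_dir = test_dir / "input_schema"
schema_dir.mkdir()
my_view_schema = {
    "type": "struct",
    "fields": [
        {"name": "id", "type": "integer", "nullable": True, "metadata": {}},
        {"name": "name", "type": "string", "nullable": True, "metadata": {}},
    ],
}
(schema_dir / "my_view.json").write_text(json.dumps(my_view_schema))

# output/: optional expected output views, used for verification after the
# transformer runs.
(test_dir / "output").mkdir()

print(sorted(p.name for p in test_dir.iterdir()))
# → ['input', 'input_schema', 'output']
```

The test file from step 6 would then sit next to these folders, and the runner would pick them up by convention.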

## Example
For the transformer defined here: https://github.com/imranq2/SparkPipelineFramework.Testing/tree/main/library/features/people/my_people_feature

you can find the corresponding test here: https://github.com/imranq2/SparkPipelineFramework.Testing/tree/main/tests/library/features/people/my_people_feature

## Publishing a new package
1. Create a new release
2. The GitHub Action should automatically kick in and publish the package
3. You can see the status in the Actions tab
