# PySpark provider for Faker
[![Python package](https://github.com/spsoni/faker_pyspark/actions/workflows/python-package.yml/badge.svg)](https://github.com/spsoni/faker_pyspark/actions/workflows/python-package.yml)
[![CodeQL](https://github.com/spsoni/faker-pyspark/actions/workflows/codeql.yml/badge.svg)](https://github.com/spsoni/faker-pyspark/actions/workflows/codeql.yml)
`faker-pyspark` is a PySpark DataFrame and Schema (StructType) provider for the `Faker` Python package.
## Description
`faker-pyspark` generates PySpark-based fake data for testing purposes. "Fake" here really means "random": the data may look realistic, but it is generated with no claim to accuracy, so do not treat it as real data!
## Installation
Install with pip:
``` bash
pip install faker-pyspark
```
Add as a provider to your Faker instance:
``` python
from faker import Faker
from faker_pyspark import PySparkProvider
fake = Faker()
fake.add_provider(PySparkProvider)
```
### PySpark DataFrame, Schema and more
``` python
>>> df = fake.pyspark_dataframe()
>>> schema = fake.pyspark_schema()
>>> df_updated = fake.pyspark_update_dataframe(df)
>>> column_names = fake.pyspark_column_names()
>>> data = fake.pyspark_data_dict_using_schema(schema)
>>> data = fake.pyspark_data_dict()
```
### CLI `faker`
```bash
$ faker pyspark_schema -i faker_pyspark
$ faker pyspark_dataframe -i faker_pyspark
$ faker pyspark_column_names -i faker_pyspark
$ faker pyspark_data_dict -i faker_pyspark
```