graphql-pydantic-converter

Name: graphql-pydantic-converter
Version: 1.2.2
Summary: Convert pydantic schema to pydantic datamodel and build request from it
Upload time: 2024-07-17 07:12:00
Author: Jozef Volak
Requires Python: <4.0,>=3.10
License: Apache 2.0
Keywords: graphql, pydantic

# GraphQL to Pydantic Converter & Pydantic to Query Builder

## Overview

The **GraphQL to Pydantic Converter** is a Python package that transforms GraphQL 
schemas in JSON format into Pydantic models. It is particularly useful for developers working with GraphQL
APIs who want to generate Pydantic models from GraphQL types for efficient data validation 
and serialization/deserialization.


## Features

- Converts GraphQL schemas in JSON format into Pydantic models.
- Builds GraphQL queries and mutations from the generated Pydantic models.

## Installation

You can install the **GraphQL to Pydantic Converter** package via pip or Poetry:

```bash
pip install graphql-pydantic-converter
# or
poetry add git+https://github.com/FRINXio/frinx-services-python-api.git@main#subdirectory=utils/graphql-pydantic-converter
```

## Usage

### CLI tool to transform GraphQL JSON to Pydantic

```bash
graphql-pydantic-converter [-h] [-i INPUT_FILE] [-o OUTPUT_FILE] [--url URL] [--headers HEADERS [HEADERS ...]]

options:
  -h, --help            show this help message and exit
  -i INPUT_FILE, --input-file INPUT_FILE 
  -o OUTPUT_FILE, --output-file OUTPUT_FILE      
  --url URL 
  --headers HEADERS [HEADERS ...] # --headers "HeaderName: HeaderValue" "HeaderName: HeaderValue"
```
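
For example, assuming the schema introspection result is saved as `schema.json` (the file names and endpoint URL below are illustrative), the models used in the examples that follow could be generated like this:

```bash
# generate models from a local introspection JSON file
graphql-pydantic-converter -i schema.json -o schedule_api.py

# or fetch the schema directly from a running GraphQL endpoint
graphql-pydantic-converter --url http://localhost:8080/graphql -o schedule_api.py --headers "Authorization: Bearer <token>"
```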

### Output from the CLI tool

```python
import typing

from pydantic import Field
from pydantic import PrivateAttr
from graphql_pydantic_converter.graphql_types import Input
from graphql_pydantic_converter.graphql_types import Mutation
from graphql_pydantic_converter.graphql_types import Payload

Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str

class CreateScheduleInput(Input):
    name: String
    workflow_name: String = Field(default=None, alias='workflowName')
    workflow_version: String = Field(default=None, alias='workflowVersion')
    cron_string: String = Field(default=None, alias='cronString')
    enabled: typing.Optional[Boolean] = Field(default=None)
    parallel_runs: typing.Optional[Boolean] = Field(default=None, alias='parallelRuns')
    workflow_context: typing.Optional[String] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[DateTime] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[DateTime] = Field(default=None, alias='toDate')

class Schedule(Payload):
    name: typing.Optional[bool] = Field(default=False)
    enabled: typing.Optional[bool] = Field(default=False)
    parallel_runs: typing.Optional[bool] = Field(alias='parallelRuns', default=False)
    workflow_name: typing.Optional[bool] = Field(alias='workflowName', default=False)
    workflow_version: typing.Optional[bool] = Field(alias='workflowVersion', default=False)
    cron_string: typing.Optional[bool] = Field(alias='cronString', default=False)
    workflow_context: typing.Optional[bool] = Field(alias='workflowContext', default=False)
    from_date: typing.Optional[bool] = Field(alias='fromDate', default=False)
    to_date: typing.Optional[bool] = Field(alias='toDate', default=False)
    status: typing.Optional[bool] = Field(default=False)


class CreateScheduleMutation(Mutation):
    _name: str = PrivateAttr('createSchedule')
    input: CreateScheduleInput
    payload: Schedule

CreateScheduleInput.model_rebuild()
CreateScheduleMutation.model_rebuild()
Schedule.model_rebuild()

```

### Query & Mutation builder

The boolean fields on the generated `Payload` model select which fields end up in the rendered selection set: set a field to `True` to request it in the response.

```python
from schedule_api import Schedule, CreateScheduleMutation, CreateScheduleInput

SCHEDULE: Schedule = Schedule(
    name=True,
    enabled=True,
    workflow_name=True,
    workflow_version=True,
    cron_string=True
)

mutation = CreateScheduleMutation(
    payload=SCHEDULE,
    input=CreateScheduleInput(
        name='name',
        workflow_name='workflowName',
        workflow_version='workflowVersion',
        cron_string='* * * * *',
        enabled=True,
        parallel_runs=False,
    )
)


```


The rendered mutation with inlined variables, as a single string:

```

mutation.render(form='inline')

mutation {
  createSchedule(
    input: {
      name: "name"
      workflowName: "workflowName"
      workflowVersion: "workflowVersion"
      cronString: "* * * * *"
      enabled: true
      parallelRuns: false
    }
  ) {
    name
    enabled
    workflowName
    workflowVersion
    cronString
  }
}

```
The rendered mutation with extracted variables:

```
mutation, variables = mutation.render(form='extracted')

# mutation as a string
mutation ($input: CreateScheduleInput!) { 
  createSchedule(input: $input) { 
    name 
    enabled 
    workflowName
    workflowVersion 
    cronString 
  } 
}

# variables as a dict[str, Any]
{
  "input": {
    "name": "name",
    "workflowName": "workflowName",
    "workflowVersion": "workflowVersion",
    "cronString": "* * * * *",
    "enabled": true,
    "parallelRuns": false
  }
}
```
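
Continuing from the builder example above, the extracted form maps directly onto a standard GraphQL HTTP request, where the query string and the variables are sent as separate keys. A minimal sketch using `requests` (the endpoint URL is a placeholder):

```python
import requests

query, variables = mutation.render(form='extracted')

# standard GraphQL-over-HTTP payload: query string plus a separate variables dict
payload = {'query': query, 'variables': variables}
resp = requests.post('http://localhost:8080/graphql', json=payload)
print(resp.json())
```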

### Response parser

Example of the generated `model.py`:

```python 
import typing
from pydantic import BaseModel, Field
from graphql_pydantic_converter.graphql_types import ENUM

Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str

class Status(ENUM):
    UNKNOWN = 'UNKNOWN'
    COMPLETED = 'COMPLETED'
    FAILED = 'FAILED'
    PAUSED = 'PAUSED'
    RUNNING = 'RUNNING'
    TERMINATED = 'TERMINATED'
    TIMED_OUT = 'TIMED_OUT'

    
class SchedulePayload(BaseModel):
    name: typing.Optional[typing.Optional[String]] = Field(default=None)
    enabled: typing.Optional[typing.Optional[Boolean]] = Field(default=None)
    parallel_runs: typing.Optional[typing.Optional[Boolean]] = Field(default=None, alias='parallelRuns')
    workflow_name: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowName')
    workflow_version: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowVersion')
    cron_string: typing.Optional[typing.Optional[String]] = Field(default=None, alias='cronString')
    workflow_context: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[typing.Optional[DateTime]] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[typing.Optional[DateTime]] = Field(default=None, alias='toDate')
    status: typing.Optional[typing.Optional[Status]] = Field(default=None)


class CreateScheduleData(BaseModel):
    create_schedule: SchedulePayload = Field(default=None, alias='createSchedule')

    
class CreateScheduleResponse(BaseModel):
    data: typing.Optional[CreateScheduleData] = Field(default=None)
    errors: typing.Optional[typing.Any] = Field(default=None)


```
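
Because the generated models declare camelCase aliases, a raw GraphQL response fragment can be loaded directly and then accessed through the snake_case attribute names. A small sketch using the `SchedulePayload` model above (assuming the generated file is importable as `model`):

```python
from model import SchedulePayload

# raw dict exactly as it appears inside a GraphQL response (camelCase keys)
raw = {
    'name': 'name',
    'enabled': True,
    'workflowName': 'workflowName',
    'cronString': '* * * * *',
}

schedule = SchedulePayload(**raw)
print(schedule.workflow_name)  # 'workflowName'
print(schedule.cron_string)    # '* * * * *'
```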

### Example of response

```python
import requests

from model import CreateScheduleResponse

# URL of the backend GraphQL service (placeholder value)
SCHELLAR_URL = 'http://localhost:8080/graphql'

# send the previously created request to the backend service
payload = {'query': mutation.render()}
resp = requests.post(SCHELLAR_URL, json=payload)
response = resp.json()

# Example of response
# { 
#    'data': {
#         'createSchedule': {
#              'name': 'name', 
#              'enabled': True, 
#              'workflowName': 'workflowName', 
#              'workflowVersion': 'workflowVersion', 
#              'cronString': '* * * * *'
#          }
#     }
# }

schedule = CreateScheduleResponse(**response)

if schedule.errors is None:
    print(schedule.data.create_schedule.workflow_name)
else:
    print(schedule.errors)
```

# 1.0.0
- Migration to pydantic v2

# 1.0.1
- Change Map type to dict (key, value)

# 1.0.2
- Stringify mutation input strings
- Add `__typename` to payload
- Create test folder as a module

# 1.1.0
- Support inline and extracted variables for mutation and query
- Stringify fix

# 1.1.1
- Fix missing inputs for mutation and query render in extracted format
- ENUM keys always uppercase

# 1.2.0
- Allow populating models by field name

# 1.2.1
- Fix rendering issue when payload is bool

# 1.2.2
- Security updates
- Disable pyright check for reportCallIssue
- Disable pyright check for reportIncompatibleVariableOverride

            
