graphql-pydantic-converter

Name: graphql-pydantic-converter
Version: 1.2.1
Summary: Convert pydantic schema to pydantic datamodel and build request from it
Author: Jozef Volak
Requires-Python: <4.0,>=3.10
License: Apache 2.0
Keywords: graphql, pydantic
Upload time: 2024-03-21 12:31:34
# GraphQL to Pydantic Converter & Pydantic to Query Builder

## Overview

The **GraphQL to Pydantic Converter** is a Python package that transforms GraphQL schemas in JSON format into Pydantic models.
It is particularly useful for developers working with GraphQL APIs who want to generate Pydantic models from GraphQL types
for data validation and serialization/deserialization.


## Features

- Converts GraphQL schemas in JSON format into Pydantic models.
- Builds GraphQL queries and mutations from the generated Pydantic models.

## Installation

You can install the **GraphQL to Pydantic Converter** package via pip or Poetry:

```bash
pip install graphql-pydantic-converter
# or
poetry add git+https://github.com/FRINXio/frinx-services-python-api.git@main#subdirectory=utils/graphql-pydantic-converter
```

## Usage

### CLI tool to transform a GraphQL JSON schema into Pydantic models

```bash
graphql-pydantic-converter [-h] [-i INPUT_FILE] [-o OUTPUT_FILE] [--url URL] [--headers HEADERS [HEADERS ...]]

options:
  -h, --help            show this help message and exit
  -i INPUT_FILE, --input-file INPUT_FILE    # path to a GraphQL schema in JSON format
  -o OUTPUT_FILE, --output-file OUTPUT_FILE # path of the generated Python module
  --url URL                                 # GraphQL endpoint to fetch the schema from
  --headers HEADERS [HEADERS ...]           # --headers "HeaderName: HeaderValue" "HeaderName: HeaderValue"
```
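For example, the `schedule_api` module used in the snippets below could be generated from a saved schema file or fetched directly from a running endpoint. The file names, URL, and auth header here are illustrative placeholders:

```bash
# from a GraphQL schema saved as JSON
graphql-pydantic-converter -i schema.json -o schedule_api.py

# or directly from a GraphQL endpoint (URL and header are placeholders)
graphql-pydantic-converter --url http://localhost:3000/graphql -o schedule_api.py \
  --headers "Authorization: Bearer <token>"
```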

### Output from the CLI tool

```python
import typing

from pydantic import Field
from pydantic import PrivateAttr
from graphql_pydantic_converter.graphql_types import Input
from graphql_pydantic_converter.graphql_types import Mutation
from graphql_pydantic_converter.graphql_types import Payload

Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str

class CreateScheduleInput(Input):
    name: String
    workflow_name: String = Field(default=None, alias='workflowName')
    workflow_version: String = Field(default=None, alias='workflowVersion')
    cron_string: String = Field(default=None, alias='cronString')
    enabled: typing.Optional[Boolean] = Field(default=None)
    parallel_runs: typing.Optional[Boolean] = Field(default=None, alias='parallelRuns')
    workflow_context: typing.Optional[String] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[DateTime] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[DateTime] = Field(default=None, alias='toDate')

class Schedule(Payload):
    name: typing.Optional[bool] = Field(default=False)
    enabled: typing.Optional[bool] = Field(default=False)
    parallel_runs: typing.Optional[bool] = Field(alias='parallelRuns', default=False)
    workflow_name: typing.Optional[bool] = Field(alias='workflowName', default=False)
    workflow_version: typing.Optional[bool] = Field(alias='workflowVersion', default=False)
    cron_string: typing.Optional[bool] = Field(alias='cronString', default=False)
    workflow_context: typing.Optional[bool] = Field(alias='workflowContext', default=False)
    from_date: typing.Optional[bool] = Field(alias='fromDate', default=False)
    to_date: typing.Optional[bool] = Field(alias='toDate', default=False)
    status: typing.Optional[bool] = Field(default=False)


class CreateScheduleMutation(Mutation):
    _name: str = PrivateAttr('createSchedule')
    input: CreateScheduleInput
    payload: Schedule

CreateScheduleInput.model_rebuild()
CreateScheduleMutation.model_rebuild()
Schedule.model_rebuild()

```
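Note that the generated fields carry camelCase aliases matching the GraphQL schema, and since version 1.2.0 the models also allow population by the Python field name. A quick illustration of both spellings (not part of the generated file; the values are made up):

```python
from schedule_api import CreateScheduleInput

# populate by alias (GraphQL name) or by Python field name; both should be accepted since 1.2.0
by_alias = CreateScheduleInput(name='example', workflowName='wf', workflowVersion='1', cronString='* * * * *')
by_field = CreateScheduleInput(name='example', workflow_name='wf', workflow_version='1', cron_string='* * * * *')

# the two instances hold the same field values and should compare equal
assert by_alias == by_field
```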

### Query & Mutation builder


```python
from schedule_api import Schedule, CreateScheduleMutation, CreateScheduleInput

SCHEDULE: Schedule = Schedule(
    name=True,
    enabled=True,
    workflow_name=True,
    workflow_version=True,
    cron_string=True
)

mutation = CreateScheduleMutation(
    payload=SCHEDULE,
    input=CreateScheduleInput(
        name='name',
        workflow_name='workflowName',
        workflow_version='workflowVersion',
        cron_string='* * * * *',
        enabled=True,
        parallel_runs=False,
    )
)


```
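The boolean fields on `Schedule` act as selection-set switches: judging from the rendered output below, only the fields set to `True` appear in the requested selection. A minimal variant requesting a different set of fields (illustrative only):

```python
# request only the name and current status in the response
MINIMAL_SCHEDULE: Schedule = Schedule(name=True, status=True)
```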


Rendering the mutation as a string with inlined variables:

```

mutation.render(form='inline')

mutation {
  createSchedule(
    input: {
      name: "name"
      workflowName: "workflowName"
      workflowVersion: "workflowVersion"
      cronString: "* * * * *"
      enabled: true
      parallelRuns: false
    }
  ) {
    name
    enabled
    workflowName
    workflowVersion
    cronString
  }
}

```
Rendering the mutation with extracted variables:

```
query, variables = mutation.render(form='extracted')

# query as a string
mutation ($input: CreateScheduleInput!) { 
  createSchedule(input: $input) { 
    name 
    enabled 
    workflowName
    workflowVersion 
    cronString 
  } 
}

# variables as a dict[str, Any]
{
  "input": {
    "name": "name",
    "workflowName": "workflowName",
    "workflowVersion": "workflowVersion",
    "cronString": "* * * * *",
    "enabled": true,
    "parallelRuns": false
  }
}
```
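The extracted form matches the usual GraphQL-over-HTTP request shape, where the query string and the variables dictionary travel as separate JSON keys. A minimal sketch of sending it with `requests` (the endpoint URL is a placeholder):

```python
import requests

GRAPHQL_URL = 'http://localhost:3000/graphql'  # placeholder endpoint

# render the mutation with extracted variables, as shown above
query, variables = mutation.render(form='extracted')

# standard GraphQL-over-HTTP payload: query string plus variables dict
resp = requests.post(GRAPHQL_URL, json={'query': query, 'variables': variables})
resp.raise_for_status()
print(resp.json())
```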

### Response parser

Example of a generated `model.py`:

```python 
import typing
from pydantic import BaseModel, Field
from graphql_pydantic_converter.graphql_types import ENUM

Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str

class Status(ENUM):
    UNKNOWN = 'UNKNOWN'
    COMPLETED = 'COMPLETED'
    FAILED = 'FAILED'
    PAUSED = 'PAUSED'
    RUNNING = 'RUNNING'
    TERMINATED = 'TERMINATED'
    TIMED_OUT = 'TIMED_OUT'

    
class SchedulePayload(BaseModel):
    name: typing.Optional[typing.Optional[String]] = Field(default=None)
    enabled: typing.Optional[typing.Optional[Boolean]] = Field(default=None)
    parallel_runs: typing.Optional[typing.Optional[Boolean]] = Field(default=None, alias='parallelRuns')
    workflow_name: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowName')
    workflow_version: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowVersion')
    cron_string: typing.Optional[typing.Optional[String]] = Field(default=None, alias='cronString')
    workflow_context: typing.Optional[typing.Optional[String]] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[typing.Optional[DateTime]] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[typing.Optional[DateTime]] = Field(default=None, alias='toDate')
    status: typing.Optional[typing.Optional[Status]] = Field(default=None)


class CreateScheduleData(BaseModel):
    create_schedule: SchedulePayload = Field(default=None, alias='createSchedule')

    
class CreateScheduleResponse(BaseModel):
    data: typing.Optional[CreateScheduleData] = Field(default=None)
    errors: typing.Optional[typing.Any] = Field(default=None)


```

### Example of response

```python

import requests

from model import CreateScheduleResponse

# SCHELLAR_URL is the GraphQL endpoint of the backend scheduling service (placeholder)
# send the previously created request to the backend service
payload = {'query': mutation.render()}
resp = requests.post(SCHELLAR_URL, json=payload)
response = resp.json()

# Example of response
# { 
#    'data': {
#         'createSchedule': {
#              'name': 'name', 
#              'enabled': True, 
#              'workflowName': 'workflowName', 
#              'workflowVersion': 'workflowVersion', 
#              'cronString': '* * * * *'
#          }
#     }
# }

schedule = CreateScheduleResponse(**response)

if schedule.errors is None:
    print(schedule.data.create_schedule.workflow_name)
else:
    print(schedule.errors)
```

# 1.0.0
- Migration to pydantic v2

# 1.0.1
- Change Map type to dict (key, value)

# 1.0.2
- Stringify mutation input strings
- Add __typename to payload
- Create test folder as a module

# 1.1.0
- Support inline and extracted variables for mutation and query
- Stringify fix

# 1.1.1
- Fix missing inputs for mutation and query render in extracted format
- ENUM keys always uppercase

# 1.2.0
- Allow populating by name

# 1.2.1
- Fix rendering issue when payload is bool
            
