pydantic-resolve

Name: pydantic-resolve
Version: 1.13.0
Home page: https://github.com/allmonday/pydantic_resolve
Summary: A business-model-friendly data orchestration tool; a simple way to implement the core concepts of clean architecture.
Upload time: 2025-08-27 12:15:40
Author: tangkikodo
Requires Python: <4.0,>=3.7
License: MIT
Keywords: pydantic, fastapi

            [![pypi](https://img.shields.io/pypi/v/pydantic-resolve.svg)](https://pypi.python.org/pypi/pydantic-resolve)
[![PyPI Downloads](https://static.pepy.tech/badge/pydantic-resolve/month)](https://pepy.tech/projects/pydantic-resolve)
![Python Versions](https://img.shields.io/pypi/pyversions/pydantic-resolve)
[![CI](https://github.com/allmonday/pydantic_resolve/actions/workflows/ci.yml/badge.svg)](https://github.com/allmonday/pydantic_resolve/actions/workflows/ci.yml)

Pydantic-resolve is a framework for composing complex data structures in an intuitive, declarative, resolver-based way, keeping the resulting data easy to understand and adjust.

It provides three major functions that facilitate fetching and modifying multi-layered data:

- pluggable resolve and post methods that define how to fetch and modify nodes.
- transporting field data from ancestor nodes to their descendant nodes, across multiple layers.
- collecting data from descendant nodes back to their ancestor nodes, across multiple layers.

It supports:

- pydantic v1
- pydantic v2
- dataclass `from pydantic.dataclasses import dataclass`

If you have experience with GraphQL, this article provides comprehensive insights: [Resolver Pattern: A Better Alternative to GraphQL in BFF (api-integration).](https://github.com/allmonday/resolver-vs-graphql/blob/master/README-en.md)

It integrates seamlessly with modern Python web frameworks, including FastAPI, Litestar, and Django-ninja.

## Hello world

Here is the root data we start with; each item is a BaseStory:

```json
[
  { "id": 1, "name": "story - 1" },
  { "id": 2, "name": "story - 2" }
]
```

First, let's define Story, which inherits from BaseStory and adds a tasks field.

All the details of the query process are encapsulated inside the dataloader, so there is no need to worry about N+1 queries.
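The batching idea behind a dataloader can be sketched in plain Python. This toy is illustrative only (it is not pydantic-resolve's or aiodataloader's actual implementation): concurrent `load()` calls are deduplicated and fulfilled by a single batched fetch.

```python
import asyncio

class ToyLoader:
    """Toy batching loader: concurrent load() calls are deduplicated
    and fulfilled by one call to batch_fn (illustrative only)."""
    def __init__(self, batch_fn):
        self.batch_fn = batch_fn  # async: list[key] -> list[value], same order
        self.pending = {}         # key -> Future

    async def load(self, key):
        if not self.pending:
            # First load of this batch: dispatch once the current
            # round of load() calls has queued all its keys.
            loop = asyncio.get_running_loop()
            loop.call_soon(lambda: asyncio.ensure_future(self._dispatch()))
        if key not in self.pending:
            self.pending[key] = asyncio.get_running_loop().create_future()
        return await self.pending[key]

    async def _dispatch(self):
        batch, self.pending = self.pending, {}
        values = await self.batch_fn(list(batch))
        for fut, value in zip(batch.values(), values):
            fut.set_result(value)

# Demo: three concurrent loads collapse into one batched fetch.
calls = []
async def batch_get_users(ids):
    calls.append(list(ids))
    return [{"id": i, "name": f"user-{i}"} for i in ids]

async def main():
    loader = ToyLoader(batch_get_users)
    return await asyncio.gather(loader.load(1), loader.load(2), loader.load(1))

users = asyncio.run(main())
# calls == [[1, 2]]: keys were deduplicated and fetched in one batch
```

This is the mechanism that turns N follow-up queries into one batched query per layer.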

```python
from pydantic_resolve import Resolver, Loader
from biz_models import BaseTask, BaseStory, BaseUser
from biz_services import UserLoader, StoryTaskLoader

class Story(BaseStory):
    tasks: list[BaseTask] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

stories = [Story(**s) for s in await query_stories()]
data = await Resolver().resolve(stories)
```

Then the data looks like:

```json
[
  {
    "id": 1,
    "name": "story - 1",
    "tasks": [
      {
        "id": 1,
        "name": "design",
        "user_id": 2
      }
    ]
  },
  {
    "id": 2,
    "name": "story - 2",
    "tasks": [
      {
        "id": 2,
        "name": "add ut",
        "user_id": 2
      }
    ]
  }
]
```

If you go on to extend BaseTask and replace the item type of `Story.tasks`:

```python
from typing import Optional
from pydantic_resolve import Loader

class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

class Story(BaseStory):
    tasks: list[Task] = [] # BaseTask -> Task
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)
```

You will get the user info immediately.

```json
[
  {
    "id": 1,
    "name": "story - 1",
    "tasks": [
      {
        "id": 1,
        "name": "design",
        "user": {
          "id": 1,
          "name": "tangkikodo"
        }
      }
    ]
  },
  {
    "id": 2,
    "name": "story - 2",
    "tasks": [
      {
        "id": 2,
        "name": "add ut",
        "user": {
          "id": 2,
          "name": "john"
        }
      }
    ]
  }
]
```

With `openapi-ts` you can carry those types over to the client seamlessly.

## Installation

```
pip install pydantic-resolve
```

Starting from pydantic-resolve v1.11.0, both pydantic v1 and v2 are supported.

## Documentation

- **Documentation**: https://allmonday.github.io/pydantic-resolve/
- **Demo**: https://github.com/allmonday/pydantic-resolve-demo
- **Composition-Oriented Pattern**: https://github.com/allmonday/composition-oriented-development-pattern

## 3 Steps to construct complex data

Let's take an Agile Story as an example.

### 1. Define Domain Models

Establish entity relationships as the foundational data models.

(These are stable and serve as the architectural blueprint.)

<img width="630px" alt="image" src="https://github.com/user-attachments/assets/2656f72e-1af5-467a-96f9-cab95760b720" />

```python
from typing import Optional

from pydantic import BaseModel

class BaseStory(BaseModel):
    id: int
    name: str
    assignee_id: Optional[int]
    report_to: Optional[int]

class BaseTask(BaseModel):
    id: int
    story_id: int
    name: str
    estimate: int
    done: bool
    assignee_id: Optional[int]

class BaseUser(BaseModel):
    id: int
    name: str
    title: str
```

```python
from aiodataloader import DataLoader
from pydantic_resolve import build_list, build_object

class StoryTaskLoader(DataLoader):
    async def batch_load_fn(self, keys: list[int]):
        tasks = await get_tasks_by_story_ids(keys)
        return build_list(tasks, keys, lambda x: x.story_id)

class UserLoader(DataLoader):
    async def batch_load_fn(self, keys: list[int]):
        users = await get_users_by_ids(keys)
        return build_object(users, keys, lambda x: x.id)
```

DataLoader implementations support flexible data sources, from database queries to microservice RPC calls, and a loader can be swapped out later during optimization.
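A `batch_load_fn` must return one result per key, in key order, which is what `build_list` and `build_object` take care of. Their behavior can be approximated like this (a sketch of the semantics, not the library's actual implementation):

```python
from collections import defaultdict

def toy_build_list(items, keys, get_key):
    """Group items by key; return one list per requested key, in key order."""
    groups = defaultdict(list)
    for item in items:
        groups[get_key(item)].append(item)
    return [groups[k] for k in keys]

def toy_build_object(items, keys, get_key):
    """Index items by key; return one item (or None) per requested key."""
    index = {get_key(item): item for item in items}
    return [index.get(k) for k in keys]

tasks = [
    {"id": 1, "story_id": 1, "name": "design"},
    {"id": 2, "story_id": 2, "name": "add ut"},
]
# The i-th result corresponds to keys[i]; missing keys yield [] / None,
# matching the DataLoader batch_load_fn contract.
grouped = toy_build_list(tasks, [2, 1, 3], lambda t: t["story_id"])
# grouped -> [[task 2], [task 1], []]
```

Aligning results with the requested keys is what lets the loader hand each caller exactly its own slice of the batched result.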

### 2. Compose Business Models

Based on the business logic, create domain-specific data structures through selective schemas and relationship dataloaders.

We need to extend `tasks`, `assignee`, and `reporter` on `Story`, and `user` on `Task`.

Extending new fields is dynamic and driven by business requirements, but the relationships/loaders are constrained by the definitions from step 1.

<img width="630px" alt="image" src="https://github.com/user-attachments/assets/ffc74e60-0670-475c-85ab-cb0d03460813" />

```python
from pydantic_resolve import Loader

class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=Loader(UserLoader)):
        return loader.load(self.report_to) if self.report_to else None
```

Use the `ensure_subset` decorator for field validation and consistency enforcement:

```python
from pydantic_resolve import ensure_subset, Loader

@ensure_subset(BaseStory)
class Story(BaseModel):
    id: int
    assignee_id: int
    report_to: int

    tasks: list[BaseTask] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

```

> Once this composition is stable, you can consider replacing the DataLoader with specialized queries (e.g. an ORM join relationship) for better performance.

### 3. Implement View-Layer Transformations

Data from the persistence layer cannot meet every requirement; we often need extra computed fields or adjustments to the data structure.

A post method can read fields from ancestors, collect fields from descendants, or modify data fetched by a resolve method.

#### Case 1: Aggregate or collect items

<img width="630px" alt="image" src="https://github.com/user-attachments/assets/2e3b1345-9e5e-489b-a81d-dc220b9d6334" />

`__pydantic_resolve_collect__` collects fields from the current node and sends them to the ancestor node that declared the `related_users` collector.

```python
from pydantic_resolve import Loader, Collector

class Task(BaseTask):
    __pydantic_resolve_collect__ = {'user': 'related_users'}  # Propagate user to collector: 'related_users'

    user: Optional[BaseUser] = None
    def resolve_user(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id)

class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=Loader(UserLoader)):
        return loader.load(self.report_to)

    # ---------- Post-processing ------------
    related_users: list[BaseUser] = []
    def post_related_users(self, collector=Collector(alias='related_users')):
        return collector.values()
```

#### Case 2: Compute extra fields

<img width="630px" alt="image" src="https://github.com/user-attachments/assets/fd5897d6-1c6a-49ec-aab0-495070054b83" />

Post methods run after all resolve methods have completed, so they can be used to compute extra fields.

```python
class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=Loader(UserLoader)):
        return loader.load(self.report_to)

    # ---------- Post-processing ------------
    total_estimate: int = 0
    def post_total_estimate(self):
        return sum(task.estimate for task in self.tasks)
```

#### Case 3: Propagate ancestor data through ancestor_context

`__pydantic_resolve_expose__` exposes specific fields from the current node to its descendants.

Alias names must be globally unique within the root node.

Descendant nodes can read the value with `ancestor_context[alias_name]`.

```python
from pydantic_resolve import Loader

class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id)

    # ---------- Post-processing ------------
    def post_name(self, ancestor_context):  # Access story.name from the parent context
        return f"{ancestor_context['story_name']} - {self.name}"

class Story(BaseStory):
    __pydantic_resolve_expose__ = {'name': 'story_name'}

    tasks: list[Task] = []
    def resolve_tasks(self, loader=Loader(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=Loader(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=Loader(UserLoader)):
        return loader.load(self.report_to)
```

### 4. Execute Resolver().resolve()

```python
from pydantic_resolve import Resolver

stories = [Story(**s) for s in await query_stories()]
data = await Resolver().resolve(stories)
```

`query_stories()` returns a list of `BaseStory`. After we transform it into `Story`, the resolve and post fields are initialized to their default values; once `Resolver().resolve()` finishes, all of these fields have been resolved and post-processed into the expected result.
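The execution order can be sketched with a toy resolver (illustrative only; the real Resolver additionally handles loaders, context propagation, collectors, and concurrency):

```python
import asyncio

async def toy_resolve(node):
    """Toy lifecycle: run resolve_* methods, recurse into their results,
    then run post_* methods on the way out."""
    for attr in [a for a in dir(node) if a.startswith("resolve_")]:
        value = getattr(node, attr)()
        if asyncio.iscoroutine(value):
            value = await value
        setattr(node, attr[len("resolve_"):], value)
        children = value if isinstance(value, list) else [value]
        for child in children:
            if hasattr(child, "__dict__"):
                await toy_resolve(child)
    # Post methods run only after every resolve method has finished.
    for attr in [a for a in dir(node) if a.startswith("post_")]:
        setattr(node, attr[len("post_"):], getattr(node, attr)())
    return node

class ToyTask:
    def __init__(self, estimate):
        self.estimate = estimate

class ToyStory:
    tasks: list = []
    total_estimate: int = 0

    def resolve_tasks(self):
        return [ToyTask(3), ToyTask(5)]

    def post_total_estimate(self):
        # By now resolve_tasks has populated self.tasks
        return sum(t.estimate for t in self.tasks)

story = asyncio.run(toy_resolve(ToyStory()))
# story.total_estimate == 8
```

This resolve-first, post-second traversal is why post methods can safely aggregate over fields that resolve methods fetched.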

## Testing and Coverage

```shell
tox
```

```shell
tox -e coverage
python -m http.server
```

Current test coverage: 97%

## Community

[Discord](https://discord.com/channels/1197929379951558797/1197929379951558800)

            
