# pydantic-resolve

- **Name**: pydantic-resolve
- **Version**: 1.12.4
- **Summary**: It just provide a pair of pre & post methods around pydantic fields, the rest is up to your imagination
- **Home page**: https://github.com/allmonday/pydantic_resolve
- **Author**: tangkikodo
- **License**: MIT
- **Requires Python**: <4.0,>=3.7
- **Keywords**: pydantic, fastapi
- **Uploaded**: 2025-07-12 00:05:54

[![pypi](https://img.shields.io/pypi/v/pydantic-resolve.svg)](https://pypi.python.org/pypi/pydantic-resolve)
[![PyPI Downloads](https://static.pepy.tech/badge/pydantic-resolve/month)](https://pepy.tech/projects/pydantic-resolve)
![Python Versions](https://img.shields.io/pypi/pyversions/pydantic-resolve)
[![CI](https://github.com/allmonday/pydantic_resolve/actions/workflows/ci.yml/badge.svg)](https://github.com/allmonday/pydantic_resolve/actions/workflows/ci.yml)

pydantic-resolve is a sophisticated framework for composing complex data structures with an intuitive, resolver-based architecture that eliminates the N+1 query problem.

It supports:

- pydantic v1
- pydantic v2
- dataclasses (`from pydantic.dataclasses import dataclass`)

```python
# BaseStory, BaseTask, BaseUser and the DataLoaders are defined in the sections below
class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

```
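The same `resolve_`/`post_` hooks also work on pydantic dataclasses. A minimal, self-contained sketch (the class and field names are illustrative only, not from the library):

```python
import asyncio
from pydantic.dataclasses import dataclass
from pydantic_resolve import Resolver

@dataclass
class Counter:
    count: int = 0
    doubled: int = 0

    async def resolve_count(self):
        # pretend this value comes from a database or remote service
        return 21

    def post_doubled(self):
        # post hooks run after the resolve hooks of this node have finished
        return self.count * 2

async def main():
    counter = await Resolver().resolve(Counter())
    print(counter)  # Counter(count=21, doubled=42)

asyncio.run(main())
```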

If you have experience with GraphQL, this article provides comprehensive insights: [Resolver Pattern: A Better Alternative to GraphQL in BFF.](https://github.com/allmonday/resolver-vs-graphql/blob/master/README-en.md)

The framework enables progressive data enrichment through incremental field resolution, allowing seamless API evolution from flat to hierarchical data structures.

Extend your data models by implementing `resolve_field` methods for data fetching and `post_field` methods for transformations, enabling node creation, in-place modifications, or cross-node data aggregation.

Seamlessly integrates with modern Python web frameworks including FastAPI, Litestar, and Django-ninja.
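For example, a FastAPI endpoint can return resolved models directly. A minimal sketch, where the app and route are placeholders and `Story` and `query_stories` are assumed to be defined as in the sections below:

```python
from fastapi import FastAPI
from pydantic_resolve import Resolver

app = FastAPI()

@app.get("/stories", response_model=list[Story])
async def list_stories():
    stories = await query_stories()           # flat records from the data layer
    return await Resolver().resolve(stories)  # nested fields filled in by resolvers
```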


## Installation

```
pip install pydantic-resolve
```

Starting from pydantic-resolve v1.11.0, both pydantic v1 and v2 are supported.

## Documentation

- **Documentation**: https://allmonday.github.io/pydantic-resolve/v2/introduction/
- **Demo Repository**: https://github.com/allmonday/pydantic-resolve-demo
- **Composition-Oriented Pattern**: https://github.com/allmonday/composition-oriented-development-pattern

## Architecture Overview

Building complex data structures requires just 3 systematic modeling steps, plus a one-line execution step:

### 1. Define Domain Models

Establish entity relationships as foundational data models (stable, serves as architectural blueprint)

<img width="639" alt="image" src="https://github.com/user-attachments/assets/2656f72e-1af5-467a-96f9-cab95760b720" />

```python
from typing import Optional

from pydantic import BaseModel

class BaseStory(BaseModel):
    id: int
    name: str
    assignee_id: Optional[int]
    report_to: Optional[int]

class BaseTask(BaseModel):
    id: int
    story_id: int
    name: str
    estimate: int
    done: bool
    assignee_id: Optional[int]

class BaseUser(BaseModel):
    id: int
    name: str
    title: str
```

```python
from aiodataloader import DataLoader
from pydantic_resolve import build_list, build_object

class StoryTaskLoader(DataLoader):
    async def batch_load_fn(self, keys: list[int]):
        tasks = await get_tasks_by_story_ids(keys)  # your own data access (DB query, RPC call, ...)
        return build_list(tasks, keys, lambda x: x.story_id)

class UserLoader(DataLoader):
    async def batch_load_fn(self, keys: list[int]):
        users = await get_users_by_ids(keys)  # your own data access
        return build_object(users, keys, lambda x: x.id)
```

DataLoader implementations support flexible data sources, from database queries to microservice RPC calls.
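The `build_list` and `build_object` helpers align the fetched rows with the requested keys, which is the contract `DataLoader.batch_load_fn` must satisfy. A rough sketch of their semantics (the data values are made up for the example):

```python
from pydantic_resolve import build_list, build_object

tasks = [
    {"id": 1, "story_id": 10},
    {"id": 2, "story_id": 10},
    {"id": 3, "story_id": 11},
]

# build_list groups items per key, in key order; keys without matches get empty groups
build_list(tasks, [10, 11, 12], lambda t: t["story_id"])
# -> roughly [[task1, task2], [task3], []]

# build_object picks a single item (or None) per key
build_object(tasks, [1, 3, 99], lambda t: t["id"])
# -> roughly [task1, task3, None]
```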

### 2. Compose Business Models

Create domain-specific data structures through selective composition and relationship mapping (stable, reusable across use cases)

<img width="709" alt="image" src="https://github.com/user-attachments/assets/ffc74e60-0670-475c-85ab-cb0d03460813" />

```python
from pydantic_resolve import LoaderDepend

class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id) if self.assignee_id else None

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.report_to) if self.report_to else None
```

Use the `ensure_subset` decorator to check that the picked fields stay consistent with the source model:

```python
from pydantic_resolve import LoaderDepend, ensure_subset

@ensure_subset(BaseStory)
class Story(BaseModel):
    id: int
    assignee_id: int
    report_to: int

    tasks: list[BaseTask] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

```

> Once the business models are validated, you can replace a DataLoader with a specialized query for better performance, as sketched below.
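For example, `resolve_tasks` can await a purpose-built query directly instead of going through the generic loader. A sketch, where `get_tasks_for_story_page` is a hypothetical data access function, not part of the library:

```python
class Story(BaseStory):
    tasks: list[BaseTask] = []
    async def resolve_tasks(self):
        # hypothetical query tailored to this view, replacing StoryTaskLoader
        return await get_tasks_for_story_page(self.id)
```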

### 3. Implement View-Layer Transformations

Apply presentation-specific modifications and data aggregations (flexible, context-dependent)

Leverage `post_field` methods to access ancestor data, move data between nodes, and apply in-place transformations.

#### Pattern 1: Aggregate Related Entities

<img width="701" alt="image" src="https://github.com/user-attachments/assets/2e3b1345-9e5e-489b-a81d-dc220b9d6334" />

```python
from pydantic_resolve import LoaderDepend, Collector

class Task(BaseTask):
    __pydantic_resolve_collect__ = {'user': 'related_users'}  # Propagate user to collector: 'related_users'

    user: Optional[BaseUser] = None
    def resolve_user(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id)

class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.report_to)

    # ---------- Post-processing ------------
    related_users: list[BaseUser] = []
    def post_related_users(self, collector=Collector(alias='related_users')):
        return collector.values()
```
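After resolution, each story then carries its nested tasks plus a flattened `related_users` list gathered from the descendant `Task.user` fields. A short usage sketch, reusing the hypothetical `query_stories` from the execution step below:

```python
from pydantic_resolve import Resolver

stories: list[Story] = await query_stories()
await Resolver().resolve(stories)
print(stories[0].related_users)  # BaseUser objects collected from this story's tasks
```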

#### Pattern 2: Compute Derived Metrics

<img width="687" alt="image" src="https://github.com/user-attachments/assets/fd5897d6-1c6a-49ec-aab0-495070054b83" />

```python
class Story(BaseStory):
    tasks: list[Task] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.report_to)

    # ---------- Post-processing ------------
    total_estimate: int = 0
    def post_total_estimate(self):
        return sum(task.estimate for task in self.tasks)
```

#### Pattern 3: Propagate Ancestor Context

```python
from pydantic_resolve import LoaderDepend

class Task(BaseTask):
    user: Optional[BaseUser] = None
    def resolve_user(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id)

    # ---------- Post-processing ------------
    def post_name(self, ancestor_context):  # access story.name exposed by the parent Story
        return f"{ancestor_context['story_name']} - {self.name}"

class Story(BaseStory):
    __pydantic_resolve_expose__ = {'name': 'story_name'}

    tasks: list[Task] = []
    def resolve_tasks(self, loader=LoaderDepend(StoryTaskLoader)):
        return loader.load(self.id)

    assignee: Optional[BaseUser] = None
    def resolve_assignee(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.assignee_id)

    reporter: Optional[BaseUser] = None
    def resolve_reporter(self, loader=LoaderDepend(UserLoader)):
        return loader.load(self.report_to)
```

### 4. Execute Resolution Pipeline

```python
from pydantic_resolve import Resolver

stories: list[Story] = await query_stories()
await Resolver().resolve(stories)
```

Resolution complete!

## Technical Architecture

The framework significantly reduces complexity in data composition by maintaining alignment with entity-relationship models, resulting in enhanced maintainability.

> In practice, an ER-oriented modeling approach can deliver 3-5x development efficiency gains and 50%+ code reduction.

Leveraging pydantic's capabilities, it enables GraphQL-like hierarchical data structures while providing flexible business logic integration during data resolution.

Seamlessly integrates with FastAPI to construct frontend-optimized data structures and generate TypeScript SDKs for type-safe client integration.

The core architecture provides `resolve` and `post` method hooks for pydantic and dataclass objects:

- `resolve`: Handles data fetching operations
- `post`: Executes post-processing transformations

This implements a recursive resolution pipeline that completes when all descendant nodes are processed.
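A stripped-down, self-contained example of that life cycle, with no DataLoader involved (the names and values are illustrative only):

```python
import asyncio
from pydantic import BaseModel
from pydantic_resolve import Resolver

class Child(BaseModel):
    value: int

class Parent(BaseModel):
    children: list[Child] = []
    async def resolve_children(self):
        # 1. resolve hooks fetch data (sync or async)
        return [Child(value=1), Child(value=2)]

    total: int = 0
    def post_total(self):
        # 2. post hooks run only after all descendants have been resolved
        return sum(c.value for c in self.children)

async def main():
    parent = await Resolver().resolve(Parent())
    print(parent.total)  # 3

asyncio.run(main())
```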

![](docs/images/life-cycle.png)

Consider the Sprint, Story, and Task relationship hierarchy:

<img src="docs/images/real-sample.png" style="width: 600px"/>

Upon object instantiation with defined methods, pydantic-resolve traverses the data graph, executes resolution methods, and produces the complete data structure.

DataLoader integration eliminates N+1 query problems inherent in multi-level data fetching, optimizing performance characteristics.

DataLoader architecture enables modular class composition and reusability across different contexts.

Additionally, the framework provides expose and collector mechanisms for sophisticated cross-layer data processing patterns.


## Testing and Coverage

```shell
tox
```

```shell
tox -e coverage        # run the test suite and generate a coverage report
python -m http.server  # serve the generated report locally
```

Current test coverage: 97%

## Benchmark

Benchmarked with ApacheBench (`ab -c 50 -n 1000`) against FastAPI-based endpoints.

strawberry-graphql

```
Server Software:        uvicorn
Server Hostname:        localhost
Server Port:            8000

Document Path:          /graphql
Document Length:        5303 bytes

Concurrency Level:      50
Time taken for tests:   3.630 seconds
Complete requests:      1000
Failed requests:        0
Total transferred:      5430000 bytes
Total body sent:        395000
HTML transferred:       5303000 bytes
Requests per second:    275.49 [#/sec] (mean)
Time per request:       181.498 [ms] (mean)
Time per request:       3.630 [ms] (mean, across all concurrent requests)
Transfer rate:          1460.82 [Kbytes/sec] received
                        106.27 kb/s sent
                        1567.09 kb/s total

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.2      0       1
Processing:    31  178  14.3    178     272
Waiting:       30  176  14.3    176     270
Total:         31  178  14.4    179     273
```

pydantic-resolve

```
Server Software:        uvicorn
Server Hostname:        localhost
Server Port:            8000

Document Path:          /sprints
Document Length:        4621 bytes

Concurrency Level:      50
Time taken for tests:   2.194 seconds
Complete requests:      1000
Failed requests:        0
Total transferred:      4748000 bytes
HTML transferred:       4621000 bytes
Requests per second:    455.79 [#/sec] (mean)
Time per request:       109.700 [ms] (mean)
Time per request:       2.194 [ms] (mean, across all concurrent requests)
Transfer rate:          2113.36 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       1
Processing:    30  107  10.9    106     138
Waiting:       28  105  10.7    104     138
Total:         30  107  11.0    106     140
```

## Community

[Discord](https://discord.com/channels/1197929379951558797/1197929379951558800)

            
