fastapi-mock-service

Name: fastapi-mock-service
Version: 1.0.7
Home page: https://gitlab.com/eastden4ik/fastapimockserver
Summary: Professional mock service library with load testing infrastructure for FastAPI
Upload time: 2025-10-17 09:49:07
Author: Denis Sviridov
Requires Python: >=3.8
License: MIT
Keywords: fastapi, mock, testing, api, load-testing, prometheus, dashboard

# FastAPI Mock Service

[![PyPI version](https://badge.fury.io/py/fastapi-mock-service.svg)](https://badge.fury.io/py/fastapi-mock-service)
[![Python 3.8+](https://img.shields.io/badge/python-3.8+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

Professional mock service library with load testing infrastructure for FastAPI. Create powerful mock APIs with built-in
monitoring, metrics collection, and an interactive dashboard.

## 🚀 Features

- **Easy Mock Creation**: Simple decorators to create mock endpoints
- **Built-in Dashboard**: Real-time monitoring with interactive web interface
- **Prometheus Metrics**: Comprehensive metrics collection for performance analysis
- **Load Testing Support**: Built-in infrastructure for load testing mock services
- **Database Integration**: SQLite database for test results storage
- **Flexible Response Configuration**: Support for complex response scenarios
- **CLI Tool**: Command-line interface for quick setup and management
- **Auto Validation**: Automatic parameter validation with error handling

## 📦 Installation

```bash
pip install fastapi-mock-service
```

## 🎯 Quick Start

### 1. Basic Usage

```python
from fastapi_mock_service import MockService
from pydantic import BaseModel

# Create mock service
mock = MockService()

class User(BaseModel):
    id: int
    name: str
    email: str

# Create mock endpoints
@mock.get("/api/users/{user_id}")
def get_user(user_id: int):
    return User(
        id=user_id,
        name=f"User {user_id}",
        email=f"user{user_id}@example.com"
    )

@mock.post("/api/users")
def create_user(user: User):
    return {"message": "User created", "user": user}

if __name__ == "__main__":
    mock.run()
```
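
With the service running, the mock endpoints behave like a regular FastAPI app. Below is a minimal sketch of exercising them with `httpx`, assuming the default host/port and the example endpoints defined above:

```python
import httpx

# Fetch a mocked user by ID (the path parameter is validated as an int)
user = httpx.get("http://localhost:8000/api/users/42").json()
print(user)  # e.g. {"id": 42, "name": "User 42", "email": "user42@example.com"}

# Create a user via the POST endpoint; the body is validated against the User model
payload = {"id": 1, "name": "Alice", "email": "alice@example.com"}
created = httpx.post("http://localhost:8000/api/users", json=payload).json()
print(created["message"])  # "User created"
```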

### 2. Using CLI

```bash
# Create example file
fastapi-mock init my_mock.py

# Create advanced example with error codes
fastapi-mock init advanced_mock.py --advanced

# Run mock service
fastapi-mock run my_mock.py

# Run on custom port
fastapi-mock run my_mock.py --port 9000

# Test session management (independent of admin interface)
fastapi-mock test start --name "Load Test"    # Start test session
fastapi-mock test stop --session-id <id>      # Stop specific test session
fastapi-mock test status                      # Get current test session status
fastapi-mock test stop --force                # Force stop current test

# Mock endpoint control (independent of test sessions)
fastapi-mock mock activate                    # Activate mock endpoints
fastapi-mock mock deactivate                  # Deactivate mock endpoints
fastapi-mock mock status                      # Get mock endpoints status

# Start with mocks deactivated
fastapi-mock run my_mock.py --no-mocks        # Run service with inactive mocks
```

### 3. Advanced Usage with Error Codes

```python
from fastapi_mock_service import MockService
from pydantic import BaseModel
from typing import Optional
from datetime import datetime, timezone

mock = MockService()

# Define error codes
API_ERRORS = {
    "validation": {"code": "API.01000", "message": "Validation error"},
    "not_found": {"code": "API.01001", "message": "Resource not found"},
    "server_error": {"code": "API.01003", "message": "Internal server error"},
}

class StandardResult(BaseModel):
    timestamp: str
    status: int
    code: str
    message: str

class UserResponse(BaseModel):
    result: StandardResult
    data: Optional[dict] = None

def make_result(success: bool = True, error_key: Optional[str] = None) -> StandardResult:
    dt = datetime.now(timezone.utc).isoformat()
    if success:
        return StandardResult(
            timestamp=dt, status=200, code="API.00000", message="OK"
        )
    else:
        error_info = API_ERRORS.get(error_key, API_ERRORS["server_error"])
        return StandardResult(
            timestamp=dt, status=200, 
            code=error_info["code"], message=error_info["message"]
        )

@mock.get("/api/v1/users/{user_id}")
def get_user_advanced(user_id: int):
    if user_id <= 0:
        return UserResponse(result=make_result(False, "validation"))
    
    if user_id > 1000:
        return UserResponse(result=make_result(False, "not_found"))
    
    # Success response
    user_data = {"id": user_id, "name": f"User {user_id}"}
    return UserResponse(result=make_result(True), data=user_data)

if __name__ == "__main__":
    mock.run()
```
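
As a rough guide to the behaviour above, the same endpoint can be probed with different `user_id` values; the result codes below follow the `API_ERRORS` mapping and `make_result` helper defined in the example (host and port assumed to be the defaults):

```python
import httpx

base = "http://localhost:8000/api/v1/users"

# Valid ID -> success result code from make_result(True)
print(httpx.get(f"{base}/5").json()["result"]["code"])     # "API.00000"

# IDs above 1000 are mapped to the "not_found" error code
print(httpx.get(f"{base}/2000").json()["result"]["code"])  # "API.01001"

# Non-positive IDs are mapped to the "validation" error code
print(httpx.get(f"{base}/-1").json()["result"]["code"])    # "API.01000"
```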

## 🌐 Dashboard & Monitoring

Once your mock service is running, access:

- **📊 Dashboard**: `http://localhost:8000` - Interactive monitoring interface
- **📈 Metrics**: `http://localhost:8000/metrics` - Prometheus metrics endpoint
- **📚 API Docs**: `http://localhost:8000/docs` - Auto-generated API documentation

### Dashboard Features

- **Real-time Logs**: View incoming requests and responses
- **Metrics Visualization**: Charts for request counts, response times, and error rates
- **Test Management**: Start/stop load testing sessions
- **Endpoint Overview**: List of all registered mock endpoints
- **Test Results History**: Historical test results with detailed summaries
- **Independent Test Sessions**: Test sessions continue running even after closing the admin interface
- **Separate Mock Control**: Activate/deactivate mock endpoints independently of test sessions
- **Real-time Metrics**: Live updating charts and statistics

## 🧪 Load Testing

The service includes built-in load testing capabilities with independent test session management:

```python
# Your mock service automatically includes testing endpoints
# POST /api/start-test - Start a new test session
# POST /api/stop-test - Stop test and generate summary
# POST /api/reset-metrics - Reset all metrics
# GET /api/test-session-status - Get current test session status
# POST /api/stop-test-session - Stop test session by ID

# Example: Start test via HTTP
import httpx

# Start test in independent mode
response = httpx.post("http://localhost:8000/api/start-test",
                     json={"test_name": "Performance Test", "independent_mode": True})

# Get session ID from response
session_id = response.json()["test_session_id"]

# Your load testing tool hits the mock endpoints
# ... run your load tests ...

# Stop test and get results
results = httpx.post("http://localhost:8000/api/stop-test-session",
                    json={"test_session_id": session_id}).json()
print(results["summary"])

# Test continues even if admin interface is closed!
```

### CLI Test Management

For complete independence from the web interface, use the CLI commands:

```bash
# Start a test session
fastapi-mock test start --name "Load Test" --host localhost --port 8000

# Get test session status
fastapi-mock test status --host localhost --port 8000

# Stop the test session
fastapi-mock test stop --session-id <session_id> --host localhost --port 8000

# Force stop current test (no session ID needed)
fastapi-mock test stop --force --host localhost --port 8000

# Control mock endpoints independently
fastapi-mock mock activate --host localhost --port 8000
fastapi-mock mock deactivate --host localhost --port 8000
fastapi-mock mock status --host localhost --port 8000

# Start service with deactivated mocks
fastapi-mock run my_mock.py --no-mocks
```

## 📊 Metrics Collection

Automatic metrics collection includes:

- **Request Count**: Total requests per endpoint
- **Response Time**: Histogram of response times
- **Status Codes**: Distribution of response codes
- **Error Rates**: Success/failure ratios
- **Custom Result Codes**: Application-specific result codes
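
The collected metrics are served in the standard Prometheus text exposition format at `/metrics`, so they can be scraped by a Prometheus server or inspected ad hoc. A minimal sketch, assuming the default port; the exact metric names emitted by the library are not listed here, so the `"http"` filter below is only illustrative:

```python
import httpx

# Pull the raw Prometheus exposition text from the running mock service
metrics_text = httpx.get("http://localhost:8000/metrics").text

# Print non-comment sample lines that look request-related
# (the "http" substring is a guess at the metric naming; adjust as needed)
for line in metrics_text.splitlines():
    if line and not line.startswith("#") and "http" in line:
        print(line)
```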

## 🛠️ API Reference

### MockService Class

```python
class MockService:
    def __init__(self, db_url: str = "sqlite://test_results.db"):
        """Initialize mock service with optional database URL"""
    
    def get(self, path: str = None, responses: list = None, tags: list = None):
        """GET endpoint decorator"""
    
    def post(self, path: str = None, responses: list = None, tags: list = None):
        """POST endpoint decorator"""
    
    def put(self, path: str = None, responses: list = None, tags: list = None):
        """PUT endpoint decorator"""
    
    def delete(self, path: str = None, responses: list = None, tags: list = None):
        """DELETE endpoint decorator"""
    
    def run(self, host: str = "0.0.0.0", port: int = 8000, **kwargs):
        """Run the mock service"""
```
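
Based on the constructor and `run()` signatures above, the results database location, bind host, and port can all be overridden. A small sketch (the file name and port are arbitrary; the database URL follows the Tortoise ORM connection-string style shown in the default):

```python
from fastapi_mock_service import MockService

# Store load-test results in a project-specific SQLite file instead of the default
mock = MockService(db_url="sqlite://my_project_results.db")

# Host and port mirror the run() signature above
mock.run(host="127.0.0.1", port=9001)
```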

### Decorator Parameters

- **path**: URL path for the endpoint (defaults to function name)
- **responses**: List of possible responses for documentation
- **tags**: Tags for grouping endpoints in UI
- **validation_error_handler**: Custom validation error handler
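
For illustration, a hedged sketch of how these parameters might be combined on one endpoint; the exact schema expected for `responses` entries is not specified above, so the dictionaries here are purely illustrative:

```python
from fastapi_mock_service import MockService

mock = MockService()

# Group the endpoint under a "billing" tag in the UI and document two
# possible response payloads (example payloads; the expected shape is assumed)
@mock.get(
    "/api/invoices/{invoice_id}",
    tags=["billing"],
    responses=[
        {"id": 1, "status": "paid"},
        {"id": 2, "status": "overdue"},
    ],
)
def get_invoice(invoice_id: int):
    return {"id": invoice_id, "status": "paid"}

if __name__ == "__main__":
    mock.run()
```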

## 📝 Examples

### Example 1: REST API Mock

```python
from fastapi_mock_service import MockService

mock = MockService()

@mock.get("/api/products/{product_id}")
def get_product(product_id: int, include_details: bool = False):
    product = {"id": product_id, "name": f"Product {product_id}"}
    if include_details:
        product["description"] = f"Description for product {product_id}"
    return product

@mock.get("/api/products")
def list_products(category: str = "all", limit: int = 10):
    return {
        "products": [{"id": i, "name": f"Product {i}"} for i in range(1, limit + 1)],
        "category": category,
        "total": limit
    }

mock.run()
```

### Example 2: With Custom Headers

```python
from fastapi_mock_service import MockService
from fastapi import Header

mock = MockService()

@mock.get("/api/secure/data")
def get_secure_data(authorization: str = Header(...)):
    if not authorization.startswith("Bearer "):
        return {"error": "Invalid authorization header"}
    
    return {"data": "sensitive information", "user": "authenticated"}

mock.run()
```
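
A quick client-side check of the header handling above, assuming the default port (the token value is arbitrary):

```python
import httpx

url = "http://localhost:8000/api/secure/data"

# Missing the "Bearer " prefix -> the mock returns its error payload
bad = httpx.get(url, headers={"Authorization": "Token abc123"}).json()
print(bad)   # {"error": "Invalid authorization header"}

# A well-formed bearer token returns the mocked "secure" data
good = httpx.get(url, headers={"Authorization": "Bearer abc123"}).json()
print(good)  # {"data": "sensitive information", "user": "authenticated"}
```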

## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 📞 Support

- **Issues**: [GitLab Issues](https://gitlab.com/eastden4ik/fastapimockserver/-/issues)
- **Email**: Sviridov.DS@bk.ru

## Acknowledgments
Built with:

- FastAPI - Modern, fast web framework
- Prometheus Client - Metrics collection
- Tortoise ORM - Async ORM for test results
- Chart.js - Interactive charts in dashboard


------------------------------------------
##### Made with ❤️ for the API development and testing community

            
