| Field | Value |
|---|---|
| Name | nzrrest |
| Version | 0.1.0 |
| Summary | Modern async Python framework for AI APIs with native Model Context Protocol (MCP) support |
| upload_time | 2025-08-17 16:08:37 |
| home_page | None |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.11 |
| license | MIT |
| keywords | ai, api, async, framework, machine-learning, mcp, n8n |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# nzrRest Framework
<div align="center">
🤖 **Modern Async Python Framework for AI APIs with Native MCP Support**
[PyPI version](https://badge.fury.io/py/nzrrest)
[Python versions](https://pypi.org/project/nzrrest/)
[License: MIT](https://opensource.org/licenses/MIT)
[CI status](https://github.com/nzrrest/nzrrest/actions)
[Coverage](https://codecov.io/gh/nzrrest/nzrrest)
[**Documentation**](https://nzrrest.readthedocs.io) | [**Examples**](examples/) | [**n8n Integration**](examples/n8n_integration/) | [**Contributing**](CONTRIBUTING.md)
</div>
---
## ✨ What is nzrRest?
**nzrRest** is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.
### 🎯 Key Features
- 🤖 **Native AI Model Integration** - First-class support for multiple AI providers and custom models
- 🔄 **Model Context Protocol (MCP)** - Built-in MCP implementation for seamless n8n integration
- ⚡ **High Performance** - Async/await throughout with ASGI compliance
- 📊 **Context Management** - Persistent conversation contexts with automatic cleanup
- 🛡️ **Production Ready** - Rate limiting, authentication, monitoring, and error handling
- 🗄️ **Database Integration** - SQLAlchemy async with automatic migrations
- 🎨 **DRF-Inspired Serializers** - Familiar, powerful data validation and transformation
- 🚀 **Auto-Generation** - CLI tools for rapid project scaffolding
- 🐳 **Cloud Native** - Docker support with production configurations
## 🚀 Quick Start
### Installation
```bash
pip install nzrrest
```
### Create Your First AI API
```bash
# Create a new project
nzrrest new my-ai-api
# Navigate to project
cd my-ai-api
# Run the development server
nzrrest run --reload
```
Your AI API is now running at `http://localhost:8000`! 🎉
### Hello World Example
```python
from nzrrest import NzrRestApp, Router

app = NzrRestApp(title="My AI API")
router = Router()

@router.post("/chat")
async def chat(request):
    data = await request.json()

    # Use built-in AI model
    model = request.app.ai_registry.get_model("default")
    result = await model.predict({"message": data["message"]})

    return {"response": result["response"]}

app.include_router(router)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
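A quick way to verify the endpoint is to call it from a small client script. A minimal sketch, assuming the defaults above and that `httpx` is installed separately (it is not a framework dependency):

```python
# Smoke test for the /chat endpoint defined above.
# Assumes the dev server is running on localhost:8000 and httpx is installed.
import asyncio

import httpx

async def main():
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "http://localhost:8000/chat",
            json={"message": "Hello, nzrRest!"},
        )
        resp.raise_for_status()
        print(resp.json())  # e.g. {"response": "..."}

if __name__ == "__main__":
    asyncio.run(main())
```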
## 🤖 AI Model Integration
nzrRest makes it incredibly easy to work with AI models:
```python
from nzrrest.ai.models import AIModel

class MyCustomModel(AIModel):
    async def load_model(self):
        # Load your model (PyTorch, HuggingFace, OpenAI, etc.)
        self.model = load_my_model()
        self.is_loaded = True

    async def predict(self, payload, context=None):
        # Make predictions with optional context
        result = self.model.generate(payload["prompt"])
        return {"response": result}

# Register and use
app.ai_registry.register_model_class("custom", MyCustomModel)
await app.ai_registry.add_model("my_model", "custom", config={...})
```
### Supported AI Providers
- ✅ **OpenAI** (GPT-3.5, GPT-4, etc.)
- ✅ **Anthropic** (Claude models)
- ✅ **HuggingFace** (Transformers, Inference API)
- ✅ **Custom Models** (PyTorch, TensorFlow, etc.)
- ✅ **Mock Models** (for development and testing)
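Building on the registry calls shown above, a provider-backed model could be registered roughly like this. The `"openai"` type name and the config keys are illustrative assumptions, not documented options; check the model documentation for the exact settings your provider type accepts.

```python
# Hypothetical sketch: registering an OpenAI-backed model with the AI registry.
# The "openai" type name and the config keys below are assumptions for illustration.
import os

await app.ai_registry.add_model(
    "gpt4",       # name used to look the model up later
    "openai",     # provider/model type (assumed to be built in)
    config={
        "api_key": os.environ.get("OPENAI_API_KEY"),  # assumed config key
        "model": "gpt-4",                             # assumed config key
    },
)

# Later, inside a request handler:
# model = request.app.ai_registry.get_model("gpt4")
# result = await model.predict({"prompt": "Hello"})
```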
## 🔄 Model Context Protocol (MCP)
nzrRest implements the Model Context Protocol for stateful AI interactions:
```python
# MCP-compliant endpoint
@router.post("/mcp/{model_name}/predict")
async def mcp_predict(request, model_name: str):
    # Automatic context management
    mcp_request = MCPRequest(**(await request.json()))

    # Retrieve conversation context
    context = await get_context(mcp_request.context_id)

    # Make prediction with context
    model = request.app.ai_registry.get_model(model_name)
    result = await model.predict(mcp_request.payload, context)

    # Return MCP-compliant response
    return MCPResponse(
        request_id=mcp_request.request_id,
        context_id=mcp_request.context_id,
        result=result
    )
```
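From the client side, the endpoint above expects a JSON body carrying a `context_id` and a `payload` (the same shape the n8n example further down sends). A sketch of a caller using `httpx`; the base URL, model name, and client-generated `request_id` are assumptions for illustration:

```python
# Illustrative MCP call against the endpoint above. Field names mirror the
# MCPRequest/MCPResponse attributes used in the handler (request_id, context_id,
# payload, result); URL and model name are placeholders.
import asyncio
import uuid

import httpx

async def ask(client: httpx.AsyncClient, message: str, context_id: str) -> dict:
    resp = await client.post(
        "http://localhost:8000/mcp/my_model/predict",
        json={
            "request_id": str(uuid.uuid4()),
            "context_id": context_id,  # reuse the same id to keep conversation state
            "payload": {"message": message},
        },
    )
    resp.raise_for_status()
    return resp.json()  # expected to contain request_id, context_id and result

async def main():
    async with httpx.AsyncClient() as client:
        reply = await ask(client, "Hello!", context_id="demo-session")
        print(reply.get("result"))

if __name__ == "__main__":
    asyncio.run(main())
```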
## 🎨 Powerful Serializers
nzrRest provides robust data validation:
```python
from starlette.responses import JSONResponse

from nzrrest.serializers import BaseSerializer, CharField, FloatField

class ChatRequestSerializer(BaseSerializer):
    message = CharField(max_length=1000)
    user_id = CharField(required=False)
    temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)

    def validate(self, data):
        # Custom validation logic
        return data

# Use in endpoints
@router.post("/chat")
async def chat(request):
    data = await request.json()
    serializer = ChatRequestSerializer(data=data)

    if serializer.is_valid():
        validated_data = serializer.validated_data
        # Process with confidence...
    else:
        return JSONResponse(serializer.errors, status_code=422)
```
## 🗄️ Database Integration
Built-in async database support with SQLAlchemy:
```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text

from nzrrest.db import Base

class ConversationHistory(Base):
    __tablename__ = "conversations"

    id = Column(Integer, primary_key=True)
    user_id = Column(String(255), index=True)
    message = Column(Text)
    response = Column(Text)
    created_at = Column(DateTime, default=datetime.utcnow)

# Use in endpoints
@router.post("/chat")
async def chat(request):
    # user_id, message and response come from the request and the model output
    async with request.app.get_db_session() as session:
        # Save conversation
        conversation = ConversationHistory(
            user_id=user_id,
            message=message,
            response=response
        )
        session.add(conversation)
        await session.commit()
```
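Reading records back uses the same session helper together with standard SQLAlchemy 2.x async queries. A sketch reusing the `ConversationHistory` model above; the route path and page size are illustrative:

```python
# Fetch the most recent conversations for a user. Plain SQLAlchemy async usage;
# only get_db_session() and the model come from the example above.
from sqlalchemy import select

@router.get("/history/{user_id}")
async def history(request, user_id: str):
    async with request.app.get_db_session() as session:
        result = await session.execute(
            select(ConversationHistory)
            .where(ConversationHistory.user_id == user_id)
            .order_by(ConversationHistory.created_at.desc())
            .limit(20)
        )
        rows = result.scalars().all()
        return [
            {"message": row.message, "response": row.response}
            for row in rows
        ]
```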
## 🛡️ Production Features
### Rate Limiting
```python
from nzrrest.middleware import RateLimitMiddleware
app.add_middleware(
    RateLimitMiddleware,
    calls_per_minute=60,
    calls_per_hour=1000
)
```
### Authentication
```python
from nzrrest.middleware import AuthenticationMiddleware
app.add_middleware(
    AuthenticationMiddleware,
    secret_key="your-secret-key"
)
```
### CORS for n8n
```python
from starlette.middleware.cors import CORSMiddleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://app.n8n.cloud"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"]
)
```
## 🔧 CLI Tools
nzrRest includes powerful CLI tools for development:
```bash
# Create new project
nzrrest new my-project --template mcp-server
# Run development server
nzrrest run --reload --port 8000
# Database migrations
nzrrest migrate -m "Add user table"
nzrrest migrate --upgrade
# Model management
nzrrest models --list
nzrrest models --add openai_gpt4 --type openai
# Project info
nzrrest info
```
## 🌐 n8n Integration
Perfect for n8n workflows with built-in MCP support:
```json
{
  "nodes": [{
    "name": "AI Chat",
    "type": "n8n-nodes-base.httpRequest",
    "parameters": {
      "url": "http://your-api.com/api/v1/mcp/gpt4/predict",
      "method": "POST",
      "body": {
        "context_id": "{{ $json.session_id }}",
        "payload": {
          "message": "{{ $json.user_input }}"
        }
      }
    }
  }]
}
```
## 📊 Monitoring & Observability
Built-in monitoring capabilities:
```text
# Health checks
GET /health
GET /api/v1/models/{name}/health
# Metrics
GET /metrics
GET /api/v1/stats
# Usage analytics
GET /api/v1/usage/models
GET /api/v1/conversations/{context_id}
```
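Since these are ordinary HTTP endpoints, any scheduler or monitoring stack can poll them. A minimal liveness-check sketch, assuming the default host and port from the quick start and that `httpx` is installed:

```python
# Minimal liveness probe against the built-in /health endpoint listed above.
# The base URL and "healthy means HTTP 200" assumption are illustrative.
import asyncio

import httpx

async def check_health(base_url: str = "http://localhost:8000") -> bool:
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"{base_url}/health")
        return resp.status_code == 200

if __name__ == "__main__":
    print("healthy" if asyncio.run(check_health()) else "unhealthy")
```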
## 🐳 Docker Deployment
Production-ready containers:
```dockerfile
FROM python:3.11-slim
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```
```bash
# Build and run
docker build -t my-ai-api .
docker run -p 8000:8000 my-ai-api
# Or use docker-compose
docker-compose up -d
```
## 📚 Examples
Check out our comprehensive examples:
- [**Basic API**](examples/basic_api.py) - Simple AI API with chat functionality
- [**Advanced Chatbot**](examples/ai_chatbot.py) - Full-featured chatbot with personality
- [**n8n Integration**](examples/n8n_integration/) - Complete n8n workflow examples
- [**Custom Models**](examples/custom_models/) - Implementing your own AI models
## 📖 Documentation
- [**Quick Start Guide**](https://nzrrest.readthedocs.io/quickstart/)
- [**API Reference**](https://nzrrest.readthedocs.io/api/)
- [**AI Model Integration**](https://nzrrest.readthedocs.io/models/)
- [**MCP Specification**](https://nzrrest.readthedocs.io/mcp/)
- [**Deployment Guide**](https://nzrrest.readthedocs.io/deployment/)
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.
```bash
# Development setup
git clone https://github.com/nzrrest/nzrrest.git
cd nzrrest
pip install -e ".[dev]"
# Run tests
pytest
# Run linting
black .
isort .
flake8
```
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- Built on the excellent FastAPI and Starlette foundations
- Designed for seamless n8n integration
- Community-driven development
## 🔗 Links
- **Homepage**: [https://nzrrest.dev](https://nzrrest.dev)
- **Documentation**: [https://nzrrest.readthedocs.io](https://nzrrest.readthedocs.io)
- **PyPI**: [https://pypi.org/project/nzrrest/](https://pypi.org/project/nzrrest/)
- **GitHub**: [https://github.com/nzrrest/nzrrest](https://github.com/nzrrest/nzrrest)
- **Discord**: [https://discord.gg/nzrrest](https://discord.gg/nzrrest)
---
<div align="center">
**Built with ❤️ for the AI community**
*nzrRest Framework - Making AI APIs Simple and Powerful*
</div>
Raw data

```json
{
"_id": null,
"home_page": null,
"name": "nzrrest",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.11",
"maintainer_email": null,
"keywords": "ai, api, async, framework, machine-learning, mcp, n8n",
"author": null,
"author_email": "nzrRest Team <alairjt@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/25/54/8a3007970d06ecbe3215cff278a4936707719ac52996747188b7ca976246/nzrrest-0.1.0.tar.gz",
"platform": null,
"description": "# nzrRest Framework\n\n<div align=\"center\">\n\n\ud83e\udd16 **Modern Async Python Framework for AI APIs with Native MCP Support**\n\n[](https://badge.fury.io/py/nzrrest)\n[](https://pypi.org/project/nzrrest/)\n[](https://opensource.org/licenses/MIT)\n[](https://github.com/nzrrest/nzrrest/actions)\n[](https://codecov.io/gh/nzrrest/nzrrest)\n\n[**Documentation**](https://nzrrest.readthedocs.io) | [**Examples**](examples/) | [**n8n Integration**](examples/n8n_integration/) | [**Contributing**](CONTRIBUTING.md)\n\n</div>\n\n---\n\n## \u2728 What is nzrRest?\n\n**nzrRest** is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.\n\n### \ud83c\udfaf Key Features\n\n- \ud83e\udd16 **Native AI Model Integration** - First-class support for multiple AI providers and custom models\n- \ud83d\udd04 **Model Context Protocol (MCP)** - Built-in MCP implementation for seamless n8n integration\n- \u26a1 **High Performance** - Async/await throughout with ASGI compliance \n- \ud83d\udcca **Context Management** - Persistent conversation contexts with automatic cleanup\n- \ud83d\udee1\ufe0f **Production Ready** - Rate limiting, authentication, monitoring, and error handling\n- \ud83d\uddc4\ufe0f **Database Integration** - SQLAlchemy async with automatic migrations\n- \ud83c\udfa8 **DRF-Inspired Serializers** - Familiar, powerful data validation and transformation\n- \ud83d\ude80 **Auto-Generation** - CLI tools for rapid project scaffolding\n- \ud83d\udc33 **Cloud Native** - Docker support with production configurations\n\n## \ud83d\ude80 Quick Start\n\n### Installation\n\n```bash\npip install nzrrest\n```\n\n### Create Your First AI API\n\n```bash\n# Create a new project\nnzrrest new my-ai-api\n\n# Navigate to project\ncd my-ai-api\n\n# Run the development server\nnzrrest run --reload\n```\n\nYour AI API is now running at `http://localhost:8000`! 
\ud83c\udf89\n\n### Hello World Example\n\n```python\nfrom nzrrest import NzrRestApp, Router\n\napp = NzrRestApp(title=\"My AI API\")\nrouter = Router()\n\n@router.post(\"/chat\")\nasync def chat(request):\n data = await request.json()\n \n # Use built-in AI model\n model = request.app.ai_registry.get_model(\"default\")\n result = await model.predict({\"message\": data[\"message\"]})\n \n return {\"response\": result[\"response\"]}\n\napp.include_router(router)\n\nif __name__ == \"__main__\":\n import uvicorn\n uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n```\n\n## \ud83e\udd16 AI Model Integration\n\nnzrRest makes it incredibly easy to work with AI models:\n\n```python\nfrom nzrrest.ai.models import AIModel\n\nclass MyCustomModel(AIModel):\n async def load_model(self):\n # Load your model (PyTorch, HuggingFace, OpenAI, etc.)\n self.model = load_my_model()\n self.is_loaded = True\n \n async def predict(self, payload, context=None):\n # Make predictions with optional context\n result = self.model.generate(payload[\"prompt\"])\n return {\"response\": result}\n\n# Register and use\napp.ai_registry.register_model_class(\"custom\", MyCustomModel)\nawait app.ai_registry.add_model(\"my_model\", \"custom\", config={...})\n```\n\n### Supported AI Providers\n\n- \u2705 **OpenAI** (GPT-3.5, GPT-4, etc.)\n- \u2705 **Anthropic** (Claude models)\n- \u2705 **HuggingFace** (Transformers, Inference API)\n- \u2705 **Custom Models** (PyTorch, TensorFlow, etc.)\n- \u2705 **Mock Models** (for development and testing)\n\n## \ud83d\udd04 Model Context Protocol (MCP)\n\nnzrRest implements the Model Context Protocol for stateful AI interactions:\n\n```python\n# MCP-compliant endpoint\n@router.post(\"/mcp/{model_name}/predict\")\nasync def mcp_predict(request, model_name: str):\n # Automatic context management\n mcp_request = MCPRequest(**(await request.json()))\n \n # Retrieve conversation context\n context = await get_context(mcp_request.context_id)\n \n # Make prediction with context\n model = request.app.ai_registry.get_model(model_name)\n result = await model.predict(mcp_request.payload, context)\n \n # Return MCP-compliant response\n return MCPResponse(\n request_id=mcp_request.request_id,\n context_id=mcp_request.context_id,\n result=result\n )\n```\n\n## \ud83c\udfa8 Powerful Serializers\n\nnzrRest provides robust data validation:\n\n```python\nfrom nzrrest.serializers import BaseSerializer, CharField, IntegerField\n\nclass ChatRequestSerializer(BaseSerializer):\n message = CharField(max_length=1000)\n user_id = CharField(required=False)\n temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)\n \n def validate(self, data):\n # Custom validation logic\n return data\n\n# Use in endpoints\n@router.post(\"/chat\")\nasync def chat(request):\n data = await request.json()\n serializer = ChatRequestSerializer(data=data)\n \n if serializer.is_valid():\n validated_data = serializer.validated_data\n # Process with confidence...\n else:\n return JSONResponse(serializer.errors, status_code=422)\n```\n\n## \ud83d\uddc4\ufe0f Database Integration\n\nBuilt-in async database support with SQLAlchemy:\n\n```python\nfrom nzrrest.db import Base\nfrom sqlalchemy import Column, Integer, String, DateTime\n\nclass ConversationHistory(Base):\n __tablename__ = \"conversations\"\n \n id = Column(Integer, primary_key=True)\n user_id = Column(String(255), index=True)\n message = Column(Text)\n response = Column(Text)\n created_at = Column(DateTime, default=datetime.utcnow)\n\n# Use in 
endpoints\n@router.post(\"/chat\")\nasync def chat(request):\n async with request.app.get_db_session() as session:\n # Save conversation\n conversation = ConversationHistory(\n user_id=user_id,\n message=message,\n response=response\n )\n session.add(conversation)\n await session.commit()\n```\n\n## \ud83d\udee1\ufe0f Production Features\n\n### Rate Limiting\n```python\nfrom nzrrest.middleware import RateLimitMiddleware\n\napp.add_middleware(\n RateLimitMiddleware,\n calls_per_minute=60,\n calls_per_hour=1000\n)\n```\n\n### Authentication\n```python\nfrom nzrrest.middleware import AuthenticationMiddleware\n\napp.add_middleware(\n AuthenticationMiddleware,\n secret_key=\"your-secret-key\"\n)\n```\n\n### CORS for n8n\n```python\nfrom starlette.middleware.cors import CORSMiddleware\n\napp.add_middleware(\n CORSMiddleware,\n allow_origins=[\"https://app.n8n.cloud\"],\n allow_credentials=True,\n allow_methods=[\"*\"],\n allow_headers=[\"*\"]\n)\n```\n\n## \ud83d\udd27 CLI Tools\n\nnzrRest includes powerful CLI tools for development:\n\n```bash\n# Create new project\nnzrrest new my-project --template mcp-server\n\n# Run development server \nnzrrest run --reload --port 8000\n\n# Database migrations\nnzrrest migrate -m \"Add user table\"\nnzrrest migrate --upgrade\n\n# Model management\nnzrrest models --list\nnzrrest models --add openai_gpt4 --type openai\n\n# Project info\nnzrrest info\n```\n\n## \ud83c\udf10 n8n Integration\n\nPerfect for n8n workflows with built-in MCP support:\n\n```json\n{\n \"nodes\": [{\n \"name\": \"AI Chat\",\n \"type\": \"n8n-nodes-base.httpRequest\",\n \"parameters\": {\n \"url\": \"http://your-api.com/api/v1/mcp/gpt4/predict\",\n \"method\": \"POST\",\n \"body\": {\n \"context_id\": \"{{ $json.session_id }}\",\n \"payload\": {\n \"message\": \"{{ $json.user_input }}\"\n }\n }\n }\n }]\n}\n```\n\n## \ud83d\udcca Monitoring & Observability\n\nBuilt-in monitoring capabilities:\n\n```python\n# Health checks\nGET /health\nGET /api/v1/models/{name}/health\n\n# Metrics\nGET /metrics\nGET /api/v1/stats\n\n# Usage analytics\nGET /api/v1/usage/models\nGET /api/v1/conversations/{context_id}\n```\n\n## \ud83d\udc33 Docker Deployment\n\nProduction-ready containers:\n\n```dockerfile\nFROM python:3.11-slim\nCOPY requirements.txt .\nRUN pip install -r requirements.txt\nCOPY . .\nEXPOSE 8000\nCMD [\"uvicorn\", \"main:app\", \"--host\", \"0.0.0.0\", \"--port\", \"8000\"]\n```\n\n```bash\n# Build and run\ndocker build -t my-ai-api .\ndocker run -p 8000:8000 my-ai-api\n\n# Or use docker-compose\ndocker-compose up -d\n```\n\n## \ud83d\udcda Examples\n\nCheck out our comprehensive examples:\n\n- [**Basic API**](examples/basic_api.py) - Simple AI API with chat functionality\n- [**Advanced Chatbot**](examples/ai_chatbot.py) - Full-featured chatbot with personality\n- [**n8n Integration**](examples/n8n_integration/) - Complete n8n workflow examples\n- [**Custom Models**](examples/custom_models/) - Implementing your own AI models\n\n## \ud83d\udcd6 Documentation\n\n- [**Quick Start Guide**](https://nzrrest.readthedocs.io/quickstart/)\n- [**API Reference**](https://nzrrest.readthedocs.io/api/)\n- [**AI Model Integration**](https://nzrrest.readthedocs.io/models/)\n- [**MCP Specification**](https://nzrrest.readthedocs.io/mcp/)\n- [**Deployment Guide**](https://nzrrest.readthedocs.io/deployment/)\n\n## \ud83e\udd1d Contributing\n\nWe welcome contributions! 
Please see our [Contributing Guide](CONTRIBUTING.md) for details.\n\n```bash\n# Development setup\ngit clone https://github.com/nzrrest/nzrrest.git\ncd nzrrest\npip install -e \".[dev]\"\n\n# Run tests\npytest\n\n# Run linting\nblack .\nisort .\nflake8\n```\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n\n## \ud83d\ude4f Acknowledgments\n\n- Built on the excellent FastAPI and Starlette foundations \n- Designed for seamless n8n integration\n- Community-driven development\n\n## \ud83d\udd17 Links\n\n- **Homepage**: [https://nzrrest.dev](https://nzrrest.dev)\n- **Documentation**: [https://nzrrest.readthedocs.io](https://nzrrest.readthedocs.io)\n- **PyPI**: [https://pypi.org/project/nzrrest/](https://pypi.org/project/nzrrest/)\n- **GitHub**: [https://github.com/nzrrest/nzrrest](https://github.com/nzrrest/nzrrest)\n- **Discord**: [https://discord.gg/nzrrest](https://discord.gg/nzrrest)\n\n---\n\n<div align=\"center\">\n\n**Built with \u2764\ufe0f for the AI community**\n\n*nzrRest Framework - Making AI APIs Simple and Powerful*\n\n</div>",
"bugtrack_url": null,
"license": "MIT",
"summary": "Modern async Python framework for AI APIs with native Model Context Protocol (MCP) support",
"version": "0.1.0",
"project_urls": {
"Bug Tracker": "https://github.com/nzrrest/nzrrest/issues",
"Changelog": "https://github.com/nzrrest/nzrrest/blob/main/CHANGELOG.md",
"Documentation": "https://nzrrest.readthedocs.io",
"Homepage": "https://github.com/nzrrest/nzrrest",
"Repository": "https://github.com/nzrrest/nzrrest"
},
"split_keywords": [
"ai",
" api",
" async",
" framework",
" machine-learning",
" mcp",
" n8n"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "68529570a3e91c76ae666b3e54f783e250a5089eaa97dbfc0f554decddabfdbe",
"md5": "232174a6f5d2068f4b34e71e97180616",
"sha256": "737100f65a9cb3c385027cee10b0f3fa02003b8489cca22cba08381511c9960d"
},
"downloads": -1,
"filename": "nzrrest-0.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "232174a6f5d2068f4b34e71e97180616",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.11",
"size": 42041,
"upload_time": "2025-08-17T16:08:36",
"upload_time_iso_8601": "2025-08-17T16:08:36.226600Z",
"url": "https://files.pythonhosted.org/packages/68/52/9570a3e91c76ae666b3e54f783e250a5089eaa97dbfc0f554decddabfdbe/nzrrest-0.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "25548a3007970d06ecbe3215cff278a4936707719ac52996747188b7ca976246",
"md5": "58684639e6af9ea49f85e78700d04fa9",
"sha256": "25e982fd74b753c73bf52b57f4fa8eb572edc8a74bafc8686bbb9af21d5e20fa"
},
"downloads": -1,
"filename": "nzrrest-0.1.0.tar.gz",
"has_sig": false,
"md5_digest": "58684639e6af9ea49f85e78700d04fa9",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.11",
"size": 62779,
"upload_time": "2025-08-17T16:08:37",
"upload_time_iso_8601": "2025-08-17T16:08:37.723506Z",
"url": "https://files.pythonhosted.org/packages/25/54/8a3007970d06ecbe3215cff278a4936707719ac52996747188b7ca976246/nzrrest-0.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-17 16:08:37",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "nzrrest",
"github_project": "nzrrest",
"github_not_found": true,
"lcname": "nzrrest"
}
```