| Field | Value |
| --- | --- |
| Name | nzrapi |
| Version | 0.2.1 |
| Summary | Modern async Python framework for AI APIs with native Model Context Protocol (MCP) support |
| home_page | None |
| upload_time | 2025-08-18 18:27:22 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.11 |
| license | MIT |
| keywords | ai, api, async, framework, machine-learning, mcp, n8n |
| requirements | No requirements were recorded. |
# nzrApi Framework
<div align="center">
🤖 **Modern Async Python Framework for AI APIs with Native MCP Support**
[PyPI version](https://badge.fury.io/py/nzrapi)
[Python versions](https://pypi.org/project/nzrapi/)
[License: MIT](https://opensource.org/licenses/MIT)
[CI status](https://github.com/nzrapi/nzrapi/actions)
[Coverage](https://codecov.io/gh/nzrapi/nzrapi)
[**Documentation**](https://nzrapi.readthedocs.io) | [**Examples**](examples/) | [**n8n Integration**](examples/n8n_integration/) | [**Contributing**](CONTRIBUTING.md)
</div>
---
## ✨ What is nzrApi?
**nzrApi** is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.
### 🎯 Key Features
- 🤖 **Native AI Model Integration** - First-class support for multiple AI providers and custom models
- 🔄 **Model Context Protocol (MCP)** - Built-in MCP implementation for seamless n8n integration
- ⚡ **High Performance** - Async/await throughout with ASGI compliance
- 📊 **Context Management** - Persistent conversation contexts with automatic cleanup
- 🛡️ **Production Ready** - Rate limiting, authentication, monitoring, and error handling
- 🗄️ **Database Integration** - SQLAlchemy async with automatic migrations
- 🎨 **DRF-Inspired Serializers** - Familiar, powerful data validation and transformation
- 🚀 **Auto-Generation** - CLI tools for rapid project scaffolding
- 🐳 **Cloud Native** - Docker support with production configurations
## 🚀 Quick Start
### Installation
```bash
pip install nzrapi
```
### Create Your First AI API
```bash
# Create a new project
nzrapi new my-ai-api
# Navigate to project
cd my-ai-api
# Run the development server
nzrapi run --reload
```
Your AI API is now running at `http://localhost:8000`! 🎉
### Hello World Example
```python
from nzrapi import NzrApiApp, Router

app = NzrApiApp(title="My AI API")
router = Router()

@router.post("/chat")
async def chat(request):
    data = await request.json()

    # Use the built-in AI model
    model = request.app.ai_registry.get_model("default")
    result = await model.predict({"message": data["message"]})

    return {"response": result["response"]}

app.include_router(router)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```
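Once the server is running you can exercise the route from any HTTP client. Below is a minimal sketch using `httpx`, assuming the server above is listening on `localhost:8000`, the route is mounted at `/chat` without a prefix, and a model named `default` is registered:

```python
import asyncio

import httpx  # third-party HTTP client: pip install httpx

async def main():
    async with httpx.AsyncClient() as client:
        # Call the /chat route defined in the example above
        resp = await client.post(
            "http://localhost:8000/chat",
            json={"message": "Hello, nzrApi!"},
        )
        resp.raise_for_status()
        # The handler returns {"response": ...}
        print(resp.json()["response"])

asyncio.run(main())
```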
## 🤖 AI Model Integration
nzrApi makes it incredibly easy to work with AI models:
```python
from nzrapi.ai.models import AIModel

class MyCustomModel(AIModel):
    async def load_model(self):
        # Load your model (PyTorch, HuggingFace, OpenAI, etc.)
        self.model = load_my_model()
        self.is_loaded = True

    async def predict(self, payload, context=None):
        # Make predictions with optional context
        result = self.model.generate(payload["prompt"])
        return {"response": result}

# Register and use
app.ai_registry.register_model_class("custom", MyCustomModel)
await app.ai_registry.add_model("my_model", "custom", config={...})
```
### Supported AI Providers
- ✅ **OpenAI** (GPT-3.5, GPT-4, etc.)
- ✅ **Anthropic** (Claude models)
- ✅ **HuggingFace** (Transformers, Inference API)
- ✅ **Custom Models** (PyTorch, TensorFlow, etc.)
- ✅ **Mock Models** (for development and testing)
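For the hosted providers listed above, models are registered through the same `ai_registry` used for custom models. The provider type name and config keys below are illustrative assumptions; check the nzrApi docs for the exact values your version supports:

```python
import os

# Assumed provider type ("openai") and config keys -- verify against the docs.
await app.ai_registry.add_model(
    "gpt4_chat",   # name later used with app.ai_registry.get_model("gpt4_chat")
    "openai",      # provider/model type
    config={
        "api_key": os.environ["OPENAI_API_KEY"],
        "model": "gpt-4",
    },
)
```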
## 🔄 Model Context Protocol (MCP)
nzrApi implements the Model Context Protocol for stateful AI interactions:
```python
# MCP-compliant endpoint
@router.post("/mcp/{model_name}/predict")
async def mcp_predict(request, model_name: str):
    # Automatic context management
    mcp_request = MCPRequest(**(await request.json()))

    # Retrieve conversation context
    context = await get_context(mcp_request.context_id)

    # Make prediction with context
    model = request.app.ai_registry.get_model(model_name)
    result = await model.predict(mcp_request.payload, context)

    # Return MCP-compliant response
    return MCPResponse(
        request_id=mcp_request.request_id,
        context_id=mcp_request.context_id,
        result=result,
    )
```
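On the client side, an MCP call is a plain POST whose body carries the fields the handler above unpacks (`request_id`, `context_id`, `payload`). A minimal sketch with `httpx`; the payload key `"message"` and the model name are illustrative assumptions:

```python
import uuid

import httpx

def ask(message: str, context_id: str) -> dict:
    # Field names mirror the MCPRequest consumed by the endpoint above
    body = {
        "request_id": str(uuid.uuid4()),
        "context_id": context_id,
        "payload": {"message": message},
    }
    resp = httpx.post("http://localhost:8000/mcp/my_model/predict", json=body)
    resp.raise_for_status()
    return resp.json()

# Reusing the same context_id keeps conversation state across calls
print(ask("What's the capital of New Zealand?", context_id="session-42"))
```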
## 🎨 Powerful Serializers
nzrApi provides robust data validation:
```python
from nzrapi.serializers import BaseSerializer, CharField, FloatField
from starlette.responses import JSONResponse

class ChatRequestSerializer(BaseSerializer):
    message = CharField(max_length=1000)
    user_id = CharField(required=False)
    temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)

    def validate(self, data):
        # Custom validation logic
        return data

# Use in endpoints
@router.post("/chat")
async def chat(request):
    data = await request.json()
    serializer = ChatRequestSerializer(data=data)

    if serializer.is_valid():
        validated_data = serializer.validated_data
        # Process with confidence...
    else:
        return JSONResponse(serializer.errors, status_code=422)
```
## 🗄️ Database Integration
Built-in async database support with SQLAlchemy:
```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text

from nzrapi.db import Base

class ConversationHistory(Base):
    __tablename__ = "conversations"

    id = Column(Integer, primary_key=True)
    user_id = Column(String(255), index=True)
    message = Column(Text)
    response = Column(Text)
    created_at = Column(DateTime, default=datetime.utcnow)

# Use in endpoints
@router.post("/chat")
async def chat(request):
    async with request.app.get_db_session() as session:
        # Save the conversation (user_id, message, and response come from
        # the request payload and your model call earlier in the handler)
        conversation = ConversationHistory(
            user_id=user_id,
            message=message,
            response=response,
        )
        session.add(conversation)
        await session.commit()
```
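Reads go through the same session helper. The sketch below fetches a user's recent history with SQLAlchemy's async query API; it assumes the `ConversationHistory` model above and a `@router.get` decorator analogous to `@router.post`:

```python
from sqlalchemy import select

@router.get("/history/{user_id}")
async def history(request, user_id: str):
    async with request.app.get_db_session() as session:
        stmt = (
            select(ConversationHistory)
            .where(ConversationHistory.user_id == user_id)
            .order_by(ConversationHistory.created_at.desc())
            .limit(20)
        )
        rows = (await session.execute(stmt)).scalars().all()
        return [{"message": r.message, "response": r.response} for r in rows]
```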
## 🛡️ Production Features
### Rate Limiting
```python
from nzrapi.middleware import RateLimitMiddleware

app.add_middleware(
    RateLimitMiddleware,
    calls_per_minute=60,
    calls_per_hour=1000,
)
```
### Authentication
```python
from nzrapi.middleware import AuthenticationMiddleware

app.add_middleware(
    AuthenticationMiddleware,
    secret_key="your-secret-key",
)
```
### CORS for n8n
```python
from starlette.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://app.n8n.cloud"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```
## 🔧 CLI Tools
nzrApi includes powerful CLI tools for development:
```bash
# Create new project
nzrapi new my-project --template mcp-server
# Run development server
nzrapi run --reload --port 8000
# Database migrations
nzrapi migrate -m "Add user table"
nzrapi migrate --upgrade
# Model management
nzrapi models --list
nzrapi models --add openai_gpt4 --type openai
# Project info
nzrapi info
```
## 🌐 n8n Integration
Perfect for n8n workflows with built-in MCP support:
```json
{
  "nodes": [
    {
      "name": "AI Chat",
      "type": "n8n-nodes-base.httpRequest",
      "parameters": {
        "url": "http://your-api.com/api/v1/mcp/gpt4/predict",
        "method": "POST",
        "body": {
          "context_id": "{{ $json.session_id }}",
          "payload": {
            "message": "{{ $json.user_input }}"
          }
        }
      }
    }
  ]
}
```
## 📊 Monitoring & Observability
Built-in monitoring capabilities:
```text
# Health checks
GET /health
GET /api/v1/models/{name}/health
# Metrics
GET /metrics
GET /api/v1/stats
# Usage analytics
GET /api/v1/usage/models
GET /api/v1/conversations/{context_id}
```
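Because these are ordinary HTTP endpoints, any monitoring stack can poll them. A small liveness-probe sketch in Python (endpoint path from the list above; the 200-on-healthy behavior is an assumption):

```python
import httpx

def is_healthy(base_url: str = "http://localhost:8000") -> bool:
    try:
        resp = httpx.get(f"{base_url}/health", timeout=5.0)
        return resp.status_code == 200
    except httpx.HTTPError:
        return False

if __name__ == "__main__":
    print("healthy" if is_healthy() else "unhealthy")
```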
## 🐳 Docker Deployment
Production-ready containers:
```dockerfile
FROM python:3.11-slim
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```
```bash
# Build and run
docker build -t my-ai-api .
docker run -p 8000:8000 my-ai-api
# Or use docker-compose
docker-compose up -d
```
## 📚 Examples
Check out our comprehensive examples:
- [**Basic API**](examples/basic_api.py) - Simple AI API with chat functionality
- [**Advanced Chatbot**](examples/ai_chatbot.py) - Full-featured chatbot with personality
- [**n8n Integration**](examples/n8n_integration/) - Complete n8n workflow examples
- [**Custom Models**](examples/custom_models/) - Implementing your own AI models
## 📖 Documentation
- [**Quick Start Guide**](https://nzrapi.readthedocs.io/quickstart/)
- [**API Reference**](https://nzrapi.readthedocs.io/api/)
- [**AI Model Integration**](https://nzrapi.readthedocs.io/models/)
- [**MCP Specification**](https://nzrapi.readthedocs.io/mcp/)
- [**Deployment Guide**](https://nzrapi.readthedocs.io/deployment/)
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.
```bash
# Development setup
git clone https://github.com/nzrapi/nzrapi.git
cd nzrapi
pip install -e ".[dev]"
# Run tests
pytest
# Run linting
black .
isort .
flake8
```
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- Built on the excellent FastAPI and Starlette foundations
- Designed for seamless n8n integration
- Community-driven development
## 🔗 Links
- **Homepage**: [https://nzrapi.dev](https://nzrapi.dev)
- **Documentation**: [https://nzrapi.readthedocs.io](https://nzrapi.readthedocs.io)
- **PyPI**: [https://pypi.org/project/nzrapi/](https://pypi.org/project/nzrapi/)
- **GitHub**: [https://github.com/nzrapi/nzrapi](https://github.com/nzrapi/nzrapi)
- **Discord**: [https://discord.gg/nzrapi](https://discord.gg/nzrapi)
---
<div align="center">
**Built with ❤️ for the AI community**
*nzrApi Framework - Making AI APIs Simple and Powerful*
</div>
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "nzrapi",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.11",
"maintainer_email": null,
"keywords": "ai, api, async, framework, machine-learning, mcp, n8n",
"author": null,
"author_email": "NzrApi Team <alairjt@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/2b/d0/348484d3661a58af7f922d9f2c5d51cc7b97d6e458902f2d6a664c7e42ba/nzrapi-0.2.1.tar.gz",
"platform": null,
"description": "# nzrApi Framework\n\n<div align=\"center\">\n\n\ud83e\udd16 **Modern Async Python Framework for AI APIs with Native MCP Support**\n\n[](https://badge.fury.io/py/nzrapi)\n[](https://pypi.org/project/nzrapi/)\n[](https://opensource.org/licenses/MIT)\n[](https://github.com/nzrapi/nzrapi/actions)\n[](https://codecov.io/gh/nzrapi/nzrapi)\n\n[**Documentation**](https://nzrapi.readthedocs.io) | [**Examples**](examples/) | [**n8n Integration**](examples/n8n_integration/) | [**Contributing**](CONTRIBUTING.md)\n\n</div>\n\n---\n\n## \u2728 What is nzrApi?\n\n**nzrApi** is a powerful, production-ready Python framework specifically designed for building AI-powered APIs. It combines the best of modern web frameworks with specialized features for AI model integration, making it the perfect choice for developers who want to build scalable AI services with minimal complexity.\n\n### \ud83c\udfaf Key Features\n\n- \ud83e\udd16 **Native AI Model Integration** - First-class support for multiple AI providers and custom models\n- \ud83d\udd04 **Model Context Protocol (MCP)** - Built-in MCP implementation for seamless n8n integration\n- \u26a1 **High Performance** - Async/await throughout with ASGI compliance \n- \ud83d\udcca **Context Management** - Persistent conversation contexts with automatic cleanup\n- \ud83d\udee1\ufe0f **Production Ready** - Rate limiting, authentication, monitoring, and error handling\n- \ud83d\uddc4\ufe0f **Database Integration** - SQLAlchemy async with automatic migrations\n- \ud83c\udfa8 **DRF-Inspired Serializers** - Familiar, powerful data validation and transformation\n- \ud83d\ude80 **Auto-Generation** - CLI tools for rapid project scaffolding\n- \ud83d\udc33 **Cloud Native** - Docker support with production configurations\n\n## \ud83d\ude80 Quick Start\n\n### Installation\n\n```bash\npip install nzrapi\n```\n\n### Create Your First AI API\n\n```bash\n# Create a new project\nnzrapi new my-ai-api\n\n# Navigate to project\ncd my-ai-api\n\n# Run the development server\nnzrapi run --reload\n```\n\nYour AI API is now running at `http://localhost:8000`! 
\ud83c\udf89\n\n### Hello World Example\n\n```python\nfrom nzrapi import NzrApiApp, Router\n\napp = NzrApiApp(title=\"My AI API\")\nrouter = Router()\n\n@router.post(\"/chat\")\nasync def chat(request):\n data = await request.json()\n \n # Use built-in AI model\n model = request.app.ai_registry.get_model(\"default\")\n result = await model.predict({\"message\": data[\"message\"]})\n \n return {\"response\": result[\"response\"]}\n\napp.include_router(router)\n\nif __name__ == \"__main__\":\n import uvicorn\n uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n```\n\n## \ud83e\udd16 AI Model Integration\n\nnzrApi makes it incredibly easy to work with AI models:\n\n```python\nfrom nzrapi.ai.models import AIModel\n\nclass MyCustomModel(AIModel):\n async def load_model(self):\n # Load your model (PyTorch, HuggingFace, OpenAI, etc.)\n self.model = load_my_model()\n self.is_loaded = True\n \n async def predict(self, payload, context=None):\n # Make predictions with optional context\n result = self.model.generate(payload[\"prompt\"])\n return {\"response\": result}\n\n# Register and use\napp.ai_registry.register_model_class(\"custom\", MyCustomModel)\nawait app.ai_registry.add_model(\"my_model\", \"custom\", config={...})\n```\n\n### Supported AI Providers\n\n- \u2705 **OpenAI** (GPT-3.5, GPT-4, etc.)\n- \u2705 **Anthropic** (Claude models)\n- \u2705 **HuggingFace** (Transformers, Inference API)\n- \u2705 **Custom Models** (PyTorch, TensorFlow, etc.)\n- \u2705 **Mock Models** (for development and testing)\n\n## \ud83d\udd04 Model Context Protocol (MCP)\n\nnzrApi implements the Model Context Protocol for stateful AI interactions:\n\n```python\n# MCP-compliant endpoint\n@router.post(\"/mcp/{model_name}/predict\")\nasync def mcp_predict(request, model_name: str):\n # Automatic context management\n mcp_request = MCPRequest(**(await request.json()))\n \n # Retrieve conversation context\n context = await get_context(mcp_request.context_id)\n \n # Make prediction with context\n model = request.app.ai_registry.get_model(model_name)\n result = await model.predict(mcp_request.payload, context)\n \n # Return MCP-compliant response\n return MCPResponse(\n request_id=mcp_request.request_id,\n context_id=mcp_request.context_id,\n result=result\n )\n```\n\n## \ud83c\udfa8 Powerful Serializers\n\nnzrApi provides robust data validation:\n\n```python\nfrom nzrapi.serializers import BaseSerializer, CharField, IntegerField\n\nclass ChatRequestSerializer(BaseSerializer):\n message = CharField(max_length=1000)\n user_id = CharField(required=False)\n temperature = FloatField(min_value=0.0, max_value=2.0, default=0.7)\n \n def validate(self, data):\n # Custom validation logic\n return data\n\n# Use in endpoints\n@router.post(\"/chat\")\nasync def chat(request):\n data = await request.json()\n serializer = ChatRequestSerializer(data=data)\n \n if serializer.is_valid():\n validated_data = serializer.validated_data\n # Process with confidence...\n else:\n return JSONResponse(serializer.errors, status_code=422)\n```\n\n## \ud83d\uddc4\ufe0f Database Integration\n\nBuilt-in async database support with SQLAlchemy:\n\n```python\nfrom nzrapi.db import Base\nfrom sqlalchemy import Column, Integer, String, DateTime\n\nclass ConversationHistory(Base):\n __tablename__ = \"conversations\"\n \n id = Column(Integer, primary_key=True)\n user_id = Column(String(255), index=True)\n message = Column(Text)\n response = Column(Text)\n created_at = Column(DateTime, default=datetime.utcnow)\n\n# Use in endpoints\n@router.post(\"/chat\")\nasync def 
chat(request):\n async with request.app.get_db_session() as session:\n # Save conversation\n conversation = ConversationHistory(\n user_id=user_id,\n message=message,\n response=response\n )\n session.add(conversation)\n await session.commit()\n```\n\n## \ud83d\udee1\ufe0f Production Features\n\n### Rate Limiting\n```python\nfrom nzrapi.middleware import RateLimitMiddleware\n\napp.add_middleware(\n RateLimitMiddleware,\n calls_per_minute=60,\n calls_per_hour=1000\n)\n```\n\n### Authentication\n```python\nfrom nzrapi.middleware import AuthenticationMiddleware\n\napp.add_middleware(\n AuthenticationMiddleware,\n secret_key=\"your-secret-key\"\n)\n```\n\n### CORS for n8n\n```python\nfrom starlette.middleware.cors import CORSMiddleware\n\napp.add_middleware(\n CORSMiddleware,\n allow_origins=[\"https://app.n8n.cloud\"],\n allow_credentials=True,\n allow_methods=[\"*\"],\n allow_headers=[\"*\"]\n)\n```\n\n## \ud83d\udd27 CLI Tools\n\nnzrApi includes powerful CLI tools for development:\n\n```bash\n# Create new project\nnzrapi new my-project --template mcp-server\n\n# Run development server \nnzrapi run --reload --port 8000\n\n# Database migrations\nnzrapi migrate -m \"Add user table\"\nnzrapi migrate --upgrade\n\n# Model management\nnzrapi models --list\nnzrapi models --add openai_gpt4 --type openai\n\n# Project info\nnzrapi info\n```\n\n## \ud83c\udf10 n8n Integration\n\nPerfect for n8n workflows with built-in MCP support:\n\n```json\n{\n \"nodes\": [{\n \"name\": \"AI Chat\",\n \"type\": \"n8n-nodes-base.httpRequest\",\n \"parameters\": {\n \"url\": \"http://your-api.com/api/v1/mcp/gpt4/predict\",\n \"method\": \"POST\",\n \"body\": {\n \"context_id\": \"{{ $json.session_id }}\",\n \"payload\": {\n \"message\": \"{{ $json.user_input }}\"\n }\n }\n }\n }]\n}\n```\n\n## \ud83d\udcca Monitoring & Observability\n\nBuilt-in monitoring capabilities:\n\n```python\n# Health checks\nGET /health\nGET /api/v1/models/{name}/health\n\n# Metrics\nGET /metrics\nGET /api/v1/stats\n\n# Usage analytics\nGET /api/v1/usage/models\nGET /api/v1/conversations/{context_id}\n```\n\n## \ud83d\udc33 Docker Deployment\n\nProduction-ready containers:\n\n```dockerfile\nFROM python:3.11-slim\nCOPY requirements.txt .\nRUN pip install -r requirements.txt\nCOPY . .\nEXPOSE 8000\nCMD [\"uvicorn\", \"main:app\", \"--host\", \"0.0.0.0\", \"--port\", \"8000\"]\n```\n\n```bash\n# Build and run\ndocker build -t my-ai-api .\ndocker run -p 8000:8000 my-ai-api\n\n# Or use docker-compose\ndocker-compose up -d\n```\n\n## \ud83d\udcda Examples\n\nCheck out our comprehensive examples:\n\n- [**Basic API**](examples/basic_api.py) - Simple AI API with chat functionality\n- [**Advanced Chatbot**](examples/ai_chatbot.py) - Full-featured chatbot with personality\n- [**n8n Integration**](examples/n8n_integration/) - Complete n8n workflow examples\n- [**Custom Models**](examples/custom_models/) - Implementing your own AI models\n\n## \ud83d\udcd6 Documentation\n\n- [**Quick Start Guide**](https://nzrapi.readthedocs.io/quickstart/)\n- [**API Reference**](https://nzrapi.readthedocs.io/api/)\n- [**AI Model Integration**](https://nzrapi.readthedocs.io/models/)\n- [**MCP Specification**](https://nzrapi.readthedocs.io/mcp/)\n- [**Deployment Guide**](https://nzrapi.readthedocs.io/deployment/)\n\n## \ud83e\udd1d Contributing\n\nWe welcome contributions! 
Please see our [Contributing Guide](CONTRIBUTING.md) for details.\n\n```bash\n# Development setup\ngit clone https://github.com/nzrapi/nzrapi.git\ncd nzrapi\npip install -e \".[dev]\"\n\n# Run tests\npytest\n\n# Run linting\nblack .\nisort .\nflake8\n```\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n\n## \ud83d\ude4f Acknowledgments\n\n- Built on the excellent FastAPI and Starlette foundations \n- Designed for seamless n8n integration\n- Community-driven development\n\n## \ud83d\udd17 Links\n\n- **Homepage**: [https://nzrapi.dev](https://nzrapi.dev)\n- **Documentation**: [https://nzrapi.readthedocs.io](https://nzrapi.readthedocs.io)\n- **PyPI**: [https://pypi.org/project/nzrapi/](https://pypi.org/project/nzrapi/)\n- **GitHub**: [https://github.com/nzrapi/nzrapi](https://github.com/nzrapi/nzrapi)\n- **Discord**: [https://discord.gg/nzrapi](https://discord.gg/nzrapi)\n\n---\n\n<div align=\"center\">\n\n**Built with \u2764\ufe0f for the AI community**\n\n*nzrApi Framework - Making AI APIs Simple and Powerful*\n\n</div>",
"bugtrack_url": null,
"license": "MIT",
"summary": "Modern async Python framework for AI APIs with native Model Context Protocol (MCP) support",
"version": "0.2.1",
"project_urls": {
"Bug Tracker": "https://github.com/nzrapi/nzrapi/issues",
"Changelog": "https://github.com/nzrapi/nzrapi/blob/main/CHANGELOG.md",
"Documentation": "https://nzrapi.readthedocs.io",
"Homepage": "https://github.com/nzrapi/nzrapi",
"Repository": "https://github.com/nzrapi/nzrapi"
},
"split_keywords": [
"ai",
" api",
" async",
" framework",
" machine-learning",
" mcp",
" n8n"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "f38b5daa4cc2202dc8dee6bef16f94c0479291cd22fc864334b030c07c8bcde3",
"md5": "f31fb984fa5e67f1930ea3d1b677281c",
"sha256": "63465dfaa8062f9ebc727cb3ac8c33a5c526bdd97099a612e60cea1dae9a006f"
},
"downloads": -1,
"filename": "nzrapi-0.2.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f31fb984fa5e67f1930ea3d1b677281c",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.11",
"size": 51537,
"upload_time": "2025-08-18T18:27:20",
"upload_time_iso_8601": "2025-08-18T18:27:20.293947Z",
"url": "https://files.pythonhosted.org/packages/f3/8b/5daa4cc2202dc8dee6bef16f94c0479291cd22fc864334b030c07c8bcde3/nzrapi-0.2.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "2bd0348484d3661a58af7f922d9f2c5d51cc7b97d6e458902f2d6a664c7e42ba",
"md5": "237e7175c244d119dba7e09c657c57d4",
"sha256": "0d5d7d2e2400ae569f6b1f50c2422ccb4ed22bdcb39ee4097a0b1f9caf80e1c1"
},
"downloads": -1,
"filename": "nzrapi-0.2.1.tar.gz",
"has_sig": false,
"md5_digest": "237e7175c244d119dba7e09c657c57d4",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.11",
"size": 88760,
"upload_time": "2025-08-18T18:27:22",
"upload_time_iso_8601": "2025-08-18T18:27:22.294215Z",
"url": "https://files.pythonhosted.org/packages/2b/d0/348484d3661a58af7f922d9f2c5d51cc7b97d6e458902f2d6a664c7e42ba/nzrapi-0.2.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-18 18:27:22",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "nzrapi",
"github_project": "nzrapi",
"github_not_found": true,
"lcname": "nzrapi"
}
```