<div align="center">
<img src="https://raw.githubusercontent.com/ryancraigdavis/pydyno/main/assets/logo.png" alt="PyDyno Logo" width="200"/>
</div>
# PyDyno
> Unified connection pooling for Python, inspired by Netflix's Dyno
**PyDyno** is a modern, async-first connection pooling library that provides a unified interface for managing connections to databases, caches, and HTTP services. Built with `attrs` and designed for production use.
## 🚀 Quick Start
```python
import asyncio

from sqlalchemy import text

from pydyno import PyDyno, PoolConfig
from pydyno.adapters.postgresql import PostgreSQLAdapter

async def main():
    # Create PyDyno manager
    dyno = PyDyno()

    # Configure PostgreSQL pool
    config = {
        'host': 'localhost',
        'user': 'postgres',
        'password': 'password',
        'database': 'mydb'
    }

    adapter = PostgreSQLAdapter(
        name="main_db",
        service_type="postgresql",
        config=config,
        pool_config=PoolConfig(max_connections=20)
    )

    # Add pool to manager
    await dyno.create_pool("main_db", adapter)

    # Use the pool
    pool = dyno.get_pool("main_db")
    async with pool.session_scope() as session:
        result = await session.execute(text("SELECT version()"))
        print(result.scalar())

    # Cleanup
    await dyno.close_all()

asyncio.run(main())
```
## ✨ Features

- **🔄 Unified Interface**: One consistent API for all service types
- **⚡ Async-First**: Built for modern async Python applications
- **📊 Built-in Metrics**: Track requests, response times, and health
- **🏥 Health Monitoring**: Automatic background health checks
- **🛡️ Production Ready**: Robust error handling and connection recovery
- **🔧 Highly Configurable**: Fine-tune connection pools for your needs
- **📦 Clean Architecture**: Easy to extend with new adapters
## 📋 Supported Services

| Service | Status | Adapter |
|---------|--------|---------|
| **PostgreSQL** | ✅ Ready | `PostgreSQLAdapter` |
| **Redis** | 🚧 Planned | `RedisAdapter` |
| **HTTP APIs** | 🚧 Planned | `HTTPAdapter` |
| **Kafka** | 🚧 Planned | `KafkaAdapter` |
## 🛠️ Installation
```bash
# Basic installation (the PyPI distribution is named pydyno_pool)
pip install pydyno_pool

# With PostgreSQL support
pip install "pydyno_pool[postgresql]"

# Development installation
git clone https://github.com/ryancraigdavis/PyDyno.git
cd PyDyno
pip install -e .
```
## 📖 Documentation
### Basic Concepts
- **PyDyno Manager**: Central coordinator that manages multiple connection pools
- **Adapters**: Service-specific implementations (PostgreSQL, Redis, etc.)
- **Pool Config**: Configuration for connection pool behavior
- **Metrics**: Built-in monitoring and performance tracking
### Configuration
```python
from pydyno.core.pool_config import PoolConfig

# Customize pool behavior
pool_config = PoolConfig(
    max_connections=20,          # Maximum connections in pool
    min_connections=2,           # Minimum connections to maintain
    max_overflow=30,             # Additional connections beyond max
    timeout=30.0,                # Connection timeout in seconds
    pool_recycle=3600,           # Recycle connections after 1 hour
    pool_pre_ping=True,          # Verify connections before use
    retry_attempts=3,            # Retry failed operations
    health_check_interval=60.0,  # Health check frequency
    echo=False                   # Log SQL queries (PostgreSQL)
)
```
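`retry_attempts` and `timeout` above describe a retry-with-timeout policy. As a rough illustration of that pattern, here is a standard-library sketch (the `with_retries` wrapper and its backoff schedule are hypothetical, not PyDyno's actual retry code):

```python
import asyncio

async def with_retries(op, *, retry_attempts: int = 3, timeout: float = 30.0):
    """Run an async operation with a per-attempt timeout and exponential
    backoff. `retry_attempts` must be >= 1. Sketch only."""
    last_exc = None
    for attempt in range(retry_attempts):
        try:
            # Each attempt gets its own timeout, mirroring PoolConfig.timeout
            return await asyncio.wait_for(op(), timeout=timeout)
        except Exception as exc:
            last_exc = exc
            # Back off before the next attempt: 0.1s, 0.2s, 0.4s, ...
            await asyncio.sleep(0.1 * 2 ** attempt)
    raise last_exc
```

A real pool would also distinguish retryable errors (timeouts, dropped connections) from ones that should fail fast (bad credentials, SQL errors).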
### PostgreSQL Adapter
```python
from pydyno import PoolConfig
from pydyno.adapters.postgresql import PostgreSQLAdapter, create_postgresql_adapter

# Method 1: Direct creation
adapter = PostgreSQLAdapter(
    name="my_db",
    service_type="postgresql",
    config={
        'host': 'localhost',
        'port': 5432,
        'user': 'postgres',
        'password': 'password',
        'database': 'myapp'
    },
    pool_config=PoolConfig(max_connections=10)
)

# Method 2: From environment variables
# Set: POSTGRES_HOST, POSTGRES_USER, POSTGRES_PASSWORD, POSTGRES_DB
adapter = create_postgresql_adapter("my_db")

# Method 3: From DATABASE_URL
# Set: DATABASE_URL=postgresql://user:pass@host:5432/db
adapter = create_postgresql_adapter("my_db")
```
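For reference, a `DATABASE_URL` decomposes into the same config keys used in Method 1. A standard-library sketch of that mapping (the adapter's own URL handling may differ; `parse_database_url` is a hypothetical helper):

```python
from urllib.parse import urlparse

def parse_database_url(url: str) -> dict:
    """Split a DATABASE_URL into the config keys used above (sketch only)."""
    parts = urlparse(url)
    return {
        "host": parts.hostname,
        "port": parts.port or 5432,        # default PostgreSQL port
        "user": parts.username,
        "password": parts.password,
        "database": parts.path.lstrip("/"),  # path carries the database name
    }
```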
### Usage Examples
#### FastAPI Integration
```python
from fastapi import FastAPI, Depends
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from pydyno import PyDyno
from pydyno.adapters.postgresql import create_postgresql_adapter

# Global PyDyno instance
dyno = PyDyno()

async def startup_event():
    """Initialize database pool on startup"""
    adapter = create_postgresql_adapter("main_db")
    await dyno.create_pool("main_db", adapter)

async def shutdown_event():
    """Cleanup on shutdown"""
    await dyno.close_all()

# FastAPI dependency
async def get_db_session() -> AsyncSession:
    """Get database session for routes"""
    pool = dyno.get_pool("main_db")
    return await pool.get_session()

# FastAPI app setup (the app must exist before routes are registered)
app = FastAPI()
app.add_event_handler("startup", startup_event)
app.add_event_handler("shutdown", shutdown_event)

# Use in routes (User is your SQLAlchemy model)
@app.get("/users/{user_id}")
async def get_user(user_id: int, db: AsyncSession = Depends(get_db_session)):
    result = await db.execute(select(User).where(User.id == user_id))
    return result.scalar_one_or_none()
```
#### Session Management
```python
# Automatic transaction management
async with adapter.session_scope() as session:
    # Create user
    user = User(name="John", email="john@example.com")
    session.add(user)

    # Update user (same transaction)
    user.last_login = datetime.utcnow()

    # Automatically commits on success, rolls back on error

# Raw SQL queries
result = await adapter.execute_scalar("SELECT COUNT(*) FROM users")
print(f"Total users: {result}")

# Query with parameters
users = await adapter.execute_query(
    "SELECT * FROM users WHERE created_at > :date",
    {"date": datetime(2024, 1, 1)}
)
```
#### Health Monitoring
```python
# Check health of all pools
health_results = await dyno.health_check()
print(health_results)  # {'main_db': True, 'cache': True}

# Check specific pool
is_healthy = await dyno.health_check("main_db")

# Get detailed metrics
metrics = await dyno.get_metrics_dict()
for pool_name, pool_metrics in metrics.items():
    print(f"Pool: {pool_name}")
    print(f"  Total requests: {pool_metrics['total_requests']}")
    print(f"  Success rate: {pool_metrics['success_rate']:.1f}%")
    print(f"  Avg response time: {pool_metrics['average_response_time']:.3f}s")
    print(f"  Health: {pool_metrics['health_status']}")
```
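The per-pool metrics dict above reduces naturally to a one-line summary per pool, which is handy for log lines or dashboards. A small helper over that shape (field names taken from the example; the helper itself is hypothetical, not part of PyDyno):

```python
def summarize_metrics(metrics: dict) -> list[str]:
    """Render one summary line per pool from the metrics dict shape above."""
    lines = []
    for name, m in sorted(metrics.items()):
        lines.append(
            f"{name}: {m['total_requests']} reqs, "
            f"{m['success_rate']:.1f}% ok, "
            f"{m['average_response_time'] * 1000:.1f} ms avg, "
            f"health={m['health_status']}"
        )
    return lines
```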
#### Multiple Database Pools
```python
async def setup_multiple_databases():
    dyno = PyDyno()

    # Primary database
    primary_adapter = PostgreSQLAdapter(
        name="primary",
        service_type="postgresql",
        config={"url": "postgresql://user:pass@primary-db:5432/app"},
        pool_config=PoolConfig(max_connections=20)
    )

    # Analytics database (read-only)
    analytics_adapter = PostgreSQLAdapter(
        name="analytics",
        service_type="postgresql",
        config={"url": "postgresql://user:pass@analytics-db:5432/analytics"},
        pool_config=PoolConfig(max_connections=5, echo=True)
    )

    # Add both pools
    await dyno.create_pool("primary", primary_adapter)
    await dyno.create_pool("analytics", analytics_adapter)

    return dyno

# Use different databases
async def get_user_analytics(dyno: PyDyno, user_id: int):
    # Write to primary
    primary = dyno.get_pool("primary")
    async with primary.session_scope() as session:
        user = User(id=user_id, name="John")
        session.add(user)

    # Read from analytics
    analytics = dyno.get_pool("analytics")
    result = await analytics.execute_scalar(
        "SELECT COUNT(*) FROM user_events WHERE user_id = :user_id",
        {"user_id": user_id}
    )

    return result
```
## 🧪 Testing
```bash
# Run basic functionality tests (no database required)
python test_pydyno_basic.py
# Run database tests (requires PostgreSQL)
export POSTGRES_HOST=localhost
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=password
export POSTGRES_DB=test_db
python test_pydyno_database.py
```
## 🏗️ Architecture
PyDyno follows a clean, extensible architecture:
```
pydyno/
├── core/
│   ├── manager.py       # PyDyno main manager
│   ├── adapters.py      # Base adapter interface
│   ├── pool_config.py   # Configuration classes
│   ├── utils.py         # Metrics and utilities
│   └── exceptions.py    # Custom exceptions
└── adapters/
    ├── postgresql.py    # PostgreSQL adapter
    ├── redis.py         # Redis adapter (planned)
    └── http.py          # HTTP adapter (planned)
```
### Creating Custom Adapters
```python
from pydyno.core.adapters import ConnectionAdapter

class CustomAdapter(ConnectionAdapter):
    """Custom service adapter"""

    async def initialize(self):
        """Set up your service connection pool"""
        # Initialize your client/connection pool
        self._client = YourServiceClient(
            **self.config,
            max_connections=self.pool_config.max_connections
        )
        self._initialized = True

    async def health_check(self) -> bool:
        """Check service health"""
        try:
            await self._client.ping()
            self.metrics.record_health_check(True)
            return True
        except Exception:
            self.metrics.record_health_check(False)
            return False

    async def close(self):
        """Clean up resources"""
        if self._client:
            await self._client.close()
        self._closed = True
```
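To see the health-check contract in isolation, here is a self-contained toy: a fake client with a `ping()` and a function with the same shape as the `health_check` method above (everything in this block is a stub for illustration, not PyDyno code):

```python
import asyncio

class FakeClient:
    """Stand-in for YourServiceClient: ping() raises when marked unhealthy."""
    def __init__(self, healthy: bool = True):
        self.healthy = healthy

    async def ping(self):
        if not self.healthy:
            raise ConnectionError("service down")

async def health_check(client) -> bool:
    """Same shape as the adapter method above, minus the metrics hook."""
    try:
        await client.ping()
        return True
    except Exception:
        return False

# asyncio.run(health_check(FakeClient()))                # True
# asyncio.run(health_check(FakeClient(healthy=False)))   # False
```

The key property to preserve in a real adapter: `health_check` reports failure rather than raising, so the background monitor keeps running.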
## 🎯 Why PyDyno?
### The Problem
Modern applications often need to connect to multiple services:
- Primary database (PostgreSQL)
- Cache layer (Redis)
- External APIs (HTTP)
- Message queues (Kafka)
Each service has its own connection pooling mechanism, configuration format, and management approach. This leads to:
- **Inconsistent APIs** across your codebase
- **Scattered configuration** and monitoring
- **Duplicate connection management** logic
- **No unified health checking**
### The Solution
PyDyno provides a **single, unified interface** for all your connection pools:
- ✅ **One API** for all services
- ✅ **Consistent configuration** patterns
- ✅ **Unified metrics** and monitoring
- ✅ **Centralized health checking**
- ✅ **Production-ready** error handling
### Inspired by Netflix
Netflix's [Dyno](https://github.com/Netflix/dyno) library solved this problem for Java applications at massive scale. PyDyno brings these same architectural patterns to Python, adapted for modern async applications.
## 🔮 Roadmap
### v0.2.0 - Redis Support
- Redis connection adapter
- Pub/Sub support
- Redis Cluster support
### v0.3.0 - HTTP Client Pooling
- HTTP adapter for API calls
- Load balancing strategies
- Circuit breaker pattern
### v0.4.0 - Advanced Features
- Kafka adapter
- Service discovery integration
- Prometheus metrics export
### v1.0.0 - Production Ready
- Comprehensive test suite
- Performance optimizations
- Full documentation
- Stability guarantees
## 🤝 Contributing
We welcome contributions! Areas where help is needed:
1. **New Adapters**: Redis, HTTP, Kafka, MongoDB
2. **Testing**: More test cases and edge cases
3. **Documentation**: Examples and tutorials
4. **Performance**: Benchmarks and optimizations
```bash
# Development setup
git clone https://github.com/ryancraigdavis/PyDyno.git
cd PyDyno
pip install -e ".[dev]"

# Run tests
python test_pydyno_basic.py

# Code formatting
black src/
isort src/
```
## 📄 License
MIT License - see [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- Inspired by Netflix's [Dyno](https://github.com/Netflix/dyno) library
- Built with [attrs](https://www.attrs.org/) for clean Python classes
- Uses [SQLAlchemy](https://www.sqlalchemy.org/) for PostgreSQL support
## 📞 Support
- **Issues**: [GitHub Issues](https://github.com/ryancraigdavis/pydyno/issues)
- **Discussions**: [GitHub Discussions](https://github.com/ryancraigdavis/pydyno/discussions)
---
**PyDyno** - Making connection pooling simple, unified, and powerful. 🚀