# PyAsyncSQL
**PyAsyncSQL** is an ultra-fast, asynchronous SQL layer that mimics MongoDB-style operations with full Redis caching and background write queue support.
It's designed for high-performance systems that need MongoDB-like flexibility on top of PostgreSQL.
---
## 🚀 Features
- ⚡ **Async I/O** – Fully asynchronous using `aiopg` and `aioredis`
- 🧠 **MongoDB-like syntax** – Use familiar methods like `find_one`, `insert_one`, `update_many`, etc.
- 🔁 **Redis caching** – Automatic query caching with TTL
- 🧩 **Background SQL Worker Queue** – Batched inserts/updates/deletes to reduce I/O overhead
- 🔎 **Pub/Sub Watcher** – Real-time change streaming via Redis channels
- 💥 **Automatic Retry + Reconnect** – Fault-tolerant retries for transient SQL/Redis errors
- 🧰 **Dynamic Collections** – Access collections as attributes or subscripts (e.g. `db.users` or `db["users"]`)
- 🧹 **Safe Shutdown** – Waits for all pending operations before closing
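The query-caching idea can be sketched without Redis: results are keyed by the serialized filter and expire after a TTL. This is a minimal illustration only — an in-memory dict stands in for Redis, and the class and parameter names are hypothetical, not the library's internals.

```python
import json
import time

class TTLQueryCache:
    """Minimal sketch of TTL-based query caching. A dict stands in for
    Redis; names and the 30s default are illustrative assumptions."""

    def __init__(self, ttl=30.0):
        self.ttl = ttl
        self._store = {}  # key -> (expires_at, value)

    def _key(self, collection, flt):
        # Deterministic key: sorted dict keys so equal filters map to one entry
        return f"{collection}:{json.dumps(flt, sort_keys=True)}"

    def get(self, collection, flt):
        entry = self._store.get(self._key(collection, flt))
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            del self._store[self._key(collection, flt)]  # lazy eviction
            return None
        return value

    def set(self, collection, flt, value):
        key = self._key(collection, flt)
        self._store[key] = (time.monotonic() + self.ttl, value)

cache = TTLQueryCache(ttl=30.0)
cache.set("users", {"name": "Alice"}, {"name": "Alice", "age": 25})
print(cache.get("users", {"name": "Alice"}))  # → {'name': 'Alice', 'age': 25}
```

Keying on the sorted JSON form means `{"a": 1, "b": 2}` and `{"b": 2, "a": 1}` hit the same cache entry.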
---
## 🧑‍💻 Installation
```bash
pip install pyasyncsql
```
---
## ⚙️ Quick Start
```python
import asyncio
from pyasyncsql import AsyncSqlDB

DSN = "postgres://user:password@hostname:port/dbname?sslmode=require"
REDIS_URL = "redis://localhost:6379"

async def main():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()

    users = db["users"]

    # Insert
    await users.insert_one({"name": "Alice", "age": 25})

    # Find
    user = await users.find_one({"name": "Alice"})
    print(user)

    # Update
    await users.update_many({"name": "Alice"}, {"$set": {"age": 26}})

    # Delete
    await users.delete_one({"name": "Alice"})

    # Close safely (waits for background queue to finish)
    await db.close()

asyncio.run(main())
```
---
## 🔄 Watcher (Real-Time Stream)
```python
import asyncio
from pyasyncsql import AsyncSqlDB

DSN = "postgres://user:password@hostname:port/dbname?sslmode=require"
REDIS_URL = "redis://localhost:6379"

async def listener(data):
    print("Database change detected:", data)

async def watch_changes():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()
    users = db["users"]
    await users.watch(listener)

asyncio.run(watch_changes())
```
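The watcher pattern itself is easy to see in isolation: writers publish change events to a channel, and a watcher task dispatches each event to an async callback. The sketch below uses an `asyncio.Queue` in place of a Redis pub/sub channel, and the event shape (`{"op": ..., "doc": ...}`) is an assumption for illustration.

```python
import asyncio

async def watch(channel: asyncio.Queue, callback):
    # Dispatch every published event to the callback until a None sentinel.
    while True:
        event = await channel.get()
        if event is None:  # sentinel: stop watching
            break
        await callback(event)

async def main():
    channel = asyncio.Queue()
    seen = []

    async def listener(data):
        seen.append(data)

    watcher = asyncio.create_task(watch(channel, listener))
    await channel.put({"op": "insert", "doc": {"name": "Alice"}})
    await channel.put(None)
    await watcher
    return seen

events = asyncio.run(main())
print(events)  # → [{'op': 'insert', 'doc': {'name': 'Alice'}}]
```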
---
## 🧮 API Reference
### `AsyncSqlDB`
| Method | Description |
|--------|-------------|
| `connect()` | Initialize PostgreSQL + Redis connection |
| `close()` | Gracefully close and flush all background tasks |
| `__getitem__` / `__getattr__` | Access collection dynamically |
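Dynamic collection access typically routes both attribute and subscript lookups through one factory. Here's a hedged, synchronous sketch of that pattern (the class and attribute names are illustrative, not the library's internals, and the real collection objects are async):

```python
class Collection:
    """Stand-in for the real (async) collection object."""
    def __init__(self, name):
        self.name = name

class DB:
    def __init__(self):
        self._collections = {}

    def __getitem__(self, name):
        # Subscript access: create the collection lazily, then cache it
        if name not in self._collections:
            self._collections[name] = Collection(name)
        return self._collections[name]

    def __getattr__(self, name):
        # Attribute access falls back here only when normal lookup fails;
        # guard private names so internals don't become collections.
        if name.startswith("_"):
            raise AttributeError(name)
        return self[name]

db = DB()
assert db.users is db["users"]  # both forms return the same object
print(db.users.name)  # → users
```

Caching in `__getitem__` makes the two access styles interchangeable: whichever runs first creates the object, and the other returns it.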
### `AsyncMongoLikeCollection`
| Method | Description |
|--------|-------------|
| `find_one(filter)` | Fetch one document |
| `find(filter)` | Return list of matching documents |
| `insert_one(doc)` | Insert a new document |
| `insert_many(docs)` | Insert multiple documents |
| `update_one(filter, update)` | Update a single document |
| `update_many(filter, update)` | Update multiple documents |
| `delete_one(filter)` | Delete a single document |
| `delete_many(filter)` | Delete multiple documents |
| `count_documents(filter)` | Count matching documents |
| `watch(callback)` | Listen for real-time change events via Redis |
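The filter and update documents follow MongoDB conventions: a filter is matched by field equality, and `{"$set": {...}}` overwrites the named fields. A minimal pure-Python sketch of those semantics (equality-only matching — the library may support more operators, which this sketch does not cover):

```python
def matches(doc, flt):
    """Mongo-style equality filter: every key in flt must equal
    the corresponding field in doc."""
    return all(doc.get(k) == v for k, v in flt.items())

def apply_update(doc, update):
    """Apply a {"$set": {...}} update document without mutating the input."""
    new_doc = dict(doc)
    new_doc.update(update.get("$set", {}))
    return new_doc

docs = [{"name": "Alice", "age": 25}, {"name": "Bob", "age": 30}]

# Equivalent in spirit to: update_many({"name": "Alice"}, {"$set": {"age": 26}})
updated = [apply_update(d, {"$set": {"age": 26}}) if matches(d, {"name": "Alice"}) else d
           for d in docs]
print(updated[0])  # → {'name': 'Alice', 'age': 26}
```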
---
## 🧱 Background Worker Queue
All write operations (insert/update/delete) are queued and flushed in batches to PostgreSQL, improving throughput dramatically under high load.
```python
await db.close() # ensures all batched operations are written before shutdown
```
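The batching idea can be sketched with an `asyncio.Queue` and a worker task: operations accumulate and are flushed once a batch fills, so many writes share one round-trip. A list stands in for PostgreSQL here, and the batch size and sentinel-based shutdown are illustrative assumptions, not the library's actual internals.

```python
import asyncio

BATCH_SIZE = 3
flushed_batches = []  # stands in for PostgreSQL writes

async def worker(queue: asyncio.Queue):
    batch = []
    while True:
        op = await queue.get()
        if op is None:  # sentinel: flush the remainder and stop
            if batch:
                flushed_batches.append(batch)
            break
        batch.append(op)
        if len(batch) >= BATCH_SIZE:
            flushed_batches.append(batch)  # one "round-trip" per batch
            batch = []

async def main():
    queue = asyncio.Queue()
    task = asyncio.create_task(worker(queue))
    for i in range(7):
        await queue.put(("insert", {"n": i}))
    await queue.put(None)  # mimics close(): drain the queue before shutdown
    await task

asyncio.run(main())
print([len(b) for b in flushed_batches])  # → [3, 3, 1]
```

The final partial batch is flushed on shutdown, which is exactly why a graceful `close()` matters: killing the worker early would drop queued writes.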
---
## ⚠️ Error Handling
- Automatic retries with exponential backoff for SQL and Redis operations
- Transparent reconnects for transient connection failures
- Warnings are logged via Python's `logging` module
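Exponential backoff means the wait doubles after each failed attempt (base, 2×base, 4×base, ...). A generic sketch of the idea — the attempt count, delays, and exception type are illustrative assumptions, not the library's documented behavior:

```python
import asyncio

async def with_retries(op, attempts=4, base_delay=0.05):
    """Retry an async operation with exponential backoff.
    Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return await op()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            # delay doubles each attempt: base, 2*base, 4*base, ...
            await asyncio.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

async def flaky():
    # Fails twice, then succeeds — models a transient connection error
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = asyncio.run(with_retries(flaky))
print(result, calls["n"])  # → ok 3
```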
---
## 🧰 Example Architecture
```text
        ┌────────────────┐
        │    Your App    │
        └───────┬────────┘
                │
         Async API Calls
                │
        ┌───────▼────────┐
        │   PyAsyncSQL   │
        │  (Mongo-like)  │
        └───────┬────────┘
                │
       ┌────────┴─────────┐
       │                  │
┌──────▼───┐        ┌─────▼────┐
│PostgreSQL│        │  Redis   │
│(storage) │        │ (cache + │
│          │        │  pub/sub)│
└──────────┘        └──────────┘
```
---
## 🧾 License
MIT License © 2025 Sathishzus
---
## 💬 Author
**Sathishzus** – Open Source Systems & Cloud Performance Tools
🔗 [GitHub](https://github.com/sathishzuss) | 🌐 [Website](https://sathishzus.qzz.io)