pyasyncsql

Name: pyasyncsql
Version: 0.1.2
Summary: Async SQL + Redis wrapper with Mongo-like interface, fire-and-forget SQL background updates.
Upload time: 2025-10-19 14:23:36
Requires Python: >=3.9
License: MIT
Keywords: async, postgresql, redis, mongodb, jsonb, aiopg, aioredis
Requirements: aiopg, redis>=5.0.0
Project homepage: https://github.com/sathishzuss/pyasyncsql
# PyAsyncSQL

**PyAsyncSQL** is an ultra-fast, asynchronous SQL layer that mimics MongoDB-style operations with full Redis caching and background write queue support.  
It's designed for high-performance systems that need MongoDB-like flexibility over PostgreSQL.

---

## 🚀 Features

- ⚡ **Async I/O** – Fully asynchronous using `aiopg` and `aioredis`  
- 🧠 **MongoDB-like syntax** – Use familiar methods like `find_one`, `insert_one`, `update_many`, etc.  
- 🔁 **Redis caching** – Automatic query caching with TTL  
- 🧩 **Background SQL Worker Queue** – Batched inserts/updates/deletes to reduce I/O overhead  
- 🔎 **Pub/Sub Watcher** – Real-time change streaming via Redis channels  
- 💥 **Automatic Retry + Reconnect** – Fault-tolerant retry for transient SQL/Redis errors  
- 🧰 **Dynamic Collections** – Access collections as attributes or subscripts (e.g. `db.users` or `db["users"]`)  
- 🧹 **Safe Shutdown** – Waits for all pending operations before closing

---

## 🧑‍💻 Installation

```bash
pip install pyasyncsql
```

---

## ⚙️ Quick Start

```python
import asyncio
from pyasyncsql import AsyncSqlDB

DSN = "postgres://user:password@hostname:port/dbname?sslmode=require"
REDIS_URL = "redis://localhost:6379"

async def main():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()

    users = db["users"]

    # Insert
    await users.insert_one({"name": "Alice", "age": 25})

    # Find
    user = await users.find_one({"name": "Alice"})
    print(user)

    # Update
    await users.update_many({"name": "Alice"}, {"$set": {"age": 26}})

    # Delete
    await users.delete_one({"name": "Alice"})

    # Close safely (waits for background queue to finish)
    await db.close()

asyncio.run(main())
```

---

## 🔄 Watcher (Real-Time Stream)

```python
import asyncio
from pyasyncsql import AsyncSqlDB

DSN = "postgres://user:password@hostname:port/dbname?sslmode=require"  # as in the Quick Start
REDIS_URL = "redis://localhost:6379"

async def listener(data):
    print("Database change detected:", data)

async def watch_changes():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()
    users = db["users"]
    await users.watch(listener)

asyncio.run(watch_changes())
```
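
Since `watch()` streams changes in real time, it presumably keeps running until its task is cancelled. If you need the watcher to run alongside other work, one option (a sketch under that assumption, not part of the documented API) is to schedule it as a background task:

```python
import asyncio

async def serve():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()
    users = db["users"]

    # Stream changes in the background while the rest of the app keeps working.
    watcher = asyncio.create_task(users.watch(listener))
    try:
        await asyncio.sleep(60)   # placeholder for your application logic
    finally:
        watcher.cancel()          # stop streaming
        await db.close()          # flush pending background writes
```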

---

## 🧮 API Reference

### `AsyncSqlDB`
| Method | Description |
|--------|-------------|
| `connect()` | Initialize PostgreSQL + Redis connection |
| `close()` | Gracefully close and flush all background tasks |
| `__getitem__` / `__getattr__` | Access collection dynamically |
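
Both access styles from the Features list resolve to the same collection. A minimal sketch, reusing `DSN` and `REDIS_URL` from the Quick Start:

```python
async def dynamic_access():
    db = AsyncSqlDB(DSN, REDIS_URL)
    await db.connect()

    # Attribute and subscript access refer to the same "users" collection.
    by_attribute = db.users
    by_subscript = db["users"]

    await db.close()
```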

### `AsyncMongoLikeCollection`
| Method | Description |
|--------|-------------|
| `find_one(filter)` | Fetch one document |
| `find(filter)` | Return list of matching documents |
| `insert_one(doc)` | Insert a new document |
| `insert_many(docs)` | Insert multiple documents |
| `update_one(filter, update)` | Update a single document |
| `update_many(filter, update)` | Update multiple documents |
| `delete_one(filter)` | Delete a single document |
| `delete_many(filter)` | Delete multiple documents |
| `count_documents(filter)` | Count matching documents |
| `watch(callback)` | Listen for real-time change events via Redis |
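
A short sketch exercising a few of the methods above; filters follow the Mongo-style dictionaries shown in the Quick Start, and the `events` collection name is just an example:

```python
async def events_demo(db):
    events = db["events"]

    # Batch insert several documents at once
    await events.insert_many([
        {"type": "login", "user": "alice"},
        {"type": "login", "user": "bob"},
    ])

    # Query and count matching documents
    logins = await events.find({"type": "login"})
    total = await events.count_documents({"type": "login"})
    print(total, logins)

    # Targeted update, then bulk delete
    await events.update_one({"user": "alice"}, {"$set": {"verified": True}})
    await events.delete_many({"type": "login"})
```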

---

## 🧱 Background Worker Queue

All write operations (insert/update/delete) are queued and flushed in batches to PostgreSQL, improving throughput dramatically under high load.

```python
await db.close()  # ensures all batched operations are written before shutdown
```
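
PyAsyncSQL's worker is internal, but the underlying idea is the standard asyncio producer/consumer pattern: writes land on a queue and a background task flushes them in batches, either when the batch fills or after a short wait. A rough standalone sketch of that pattern (not the library's actual implementation; `flush`, `batch_size`, and `max_wait` are illustrative names):

```python
import asyncio

async def batch_worker(queue: asyncio.Queue, flush, batch_size=100, max_wait=0.5):
    """Drain `queue`, calling `flush(batch)` when the batch is full or max_wait elapses."""
    batch = []
    while True:
        try:
            item = await asyncio.wait_for(queue.get(), timeout=max_wait)
            batch.append(item)
        except asyncio.TimeoutError:
            item = None  # nothing new within max_wait; flush whatever is pending
        if batch and (len(batch) >= batch_size or item is None):
            await flush(batch)  # e.g. one multi-row INSERT instead of many single writes
            batch = []
```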

---

## ⚠️ Error Handling

- Automatic retries with exponential backoff for SQL and Redis operations (the sketch after this list illustrates the backoff pattern)  
- Transparent reconnects for transient connection failures  
- Warnings are logged via Python's `logging` module
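
Exponential backoff simply doubles the delay between attempts. A generic sketch of the behavior described above (an illustration, not PyAsyncSQL's internal code):

```python
import asyncio
import logging

async def with_retries(op, attempts=5, base_delay=0.1):
    """Run the async callable `op`, retrying with exponentially growing delays."""
    for attempt in range(attempts):
        try:
            return await op()
        except Exception as exc:                 # e.g. a transient SQL/Redis error
            if attempt == attempts - 1:
                raise
            delay = base_delay * (2 ** attempt)  # 0.1s, 0.2s, 0.4s, ...
            logging.warning("retrying in %.1fs after %r", delay, exc)
            await asyncio.sleep(delay)
```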

---

## 🧰 Example Architecture

```text
         ┌────────────────┐
         │   Your App     │
         └──────┬─────────┘
                │
        Async API Calls
                │
         ┌──────▼─────────┐
         │  PyAsyncSQL    │
         │  (Mongo-like)  │
         └──────┬─────────┘
                │
     ┌──────────┴───────────┐
     │                      │
┌────▼─────┐           ┌────▼────┐
│PostgreSQL│           │ Redis   │
│(storage) │           │(cache + │
│          │           │ pub/sub)│
└──────────┘           └─────────┘
```

---

## 🧾 License

MIT License © 2025 Sathishzus

---

## 💬 Author

**Sathishzus** – Open Source Systems & Cloud Performance Tools  
🔗 [GitHub](https://github.com/sathishzuss) | 🌐 [Website](https://sathishzus.qzz.io)

            
