| Field | Value |
| --- | --- |
| name | cacheio |
| version | 0.1.2 |
| home_page | https://github.com/bnlucas/cacheio |
| summary | cacheio is a flexible, user-friendly Python caching interface that unifies synchronous and asynchronous caching backends. It provides a consistent API for interacting with proven libraries—cachelib for synchronous caching and aiocache for asynchronous caching—allowing seamless integration of both styles in your applications. With configurable defaults, optional backend installation, and easy-to-use decorators, cacheio simplifies caching logic while minimizing dependencies. |
| upload_time | 2025-08-08 03:38:41 |
| maintainer | None |
| docs_url | None |
| author | Nathan Lucas |
| requires_python | <4.0,>=3.10 |
| license | MIT |
| keywords | cache, caching, asyncio, synchronous, aiohttp, cachelib |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
# cacheio
A flexible and user-friendly Python caching utility that provides a **unified interface** for both synchronous and asynchronous caching, **wrapping and integrating** two well-established caching libraries: [`cachelib`](https://github.com/pallets/cachelib) for synchronous caching, and [`aiocache`](https://github.com/aio-libs/aiocache) for asynchronous caching.
`cacheio` simplifies caching in Python applications by providing a consistent API for both sync and async use cases — no need to learn two different interfaces or manage separate dependencies manually. It intelligently loads only the backend dependencies you need.
---
## Overview 🚀
`cacheio` offers a unified caching interface for Python developers, abstracting away the differences between synchronous and asynchronous caching libraries. By wrapping [`cachelib`](https://github.com/pallets/cachelib) for sync caching and [`aiocache`](https://github.com/aio-libs/aiocache) for async caching, it lets you write caching logic that is clean, consistent, and easy to maintain.
---
## Installation
Install `cacheio` with pip. Optional dependency groups pull in only the backend libraries you need.
### Basic Installation
Install the core library without any caching backend:
```bash
pip install cacheio
```
### Installing with Backends
- **Synchronous caching (cachelib-based):**
```bash
pip install "cacheio[sync]"
```
- **Asynchronous caching (aiocache-based):**
```bash
pip install "cacheio[async]"
```
- **Full installation (both sync and async):**
```bash
pip install "cacheio[full]"
```
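Because backends are installed as optional extras, it can be useful to detect at runtime which ones are available before constructing a cache. A generic stdlib-based sketch (the module names `cachelib` and `aiocache` correspond to the extras above; this helper is illustrative, not part of cacheio's API):

```python
from importlib.util import find_spec

def backend_available(module_name: str) -> bool:
    """Return True if the named backend module can be imported."""
    return find_spec(module_name) is not None

# Pick a backend based on what is installed.
if backend_available("cachelib"):
    print("sync backend available")
if backend_available("aiocache"):
    print("async backend available")
```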
---
## Quick Start
### Synchronous Caching
Use `CacheFactory.memory_cache()` to get a sync cache adapter backed by `cachelib`.
```python
from cacheio import CacheFactory
cache = CacheFactory.memory_cache()
cache.set("my_key", "my_value", ttl=300)
print(cache.get("my_key"))
```
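Conceptually, a memory cache with per-entry TTL behaves like the following simplified stand-in (not cacheio's actual implementation): each value is stored with an expiry timestamp, and expired entries are treated as misses.

```python
import time

class TTLCache:
    """Minimal in-memory cache with per-key time-to-live (illustrative only)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl):
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict expired entries
            return default
        return value

cache = TTLCache()
cache.set("my_key", "my_value", ttl=300)
print(cache.get("my_key"))  # "my_value" until the TTL elapses
```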
### Asynchronous Caching
Use `CacheFactory.async_memory_cache()` to get an async cache adapter backed by `aiocache`.
```python
import asyncio
from cacheio import CacheFactory
async def main():
    async_cache = CacheFactory.async_memory_cache()
    await async_cache.set("my_async_key", "my_async_value", ttl=300)
    val = await async_cache.get("my_async_key")
    print(val)

asyncio.run(main())
```
---
## Using Decorators for Method Result Caching
`cacheio` provides four decorators to easily cache method results with minimal boilerplate:
- `@cached`: Sync decorator with automatic cache key generation.
- `@memoized`: Sync decorator with user-defined key function.
- `@async_cached`: Async decorator with automatic cache key generation.
- `@async_memoized`: Async decorator with user-defined async key function.
### 1. Synchronous `@cached`
Automatically caches method results using method arguments as the cache key.
```python
from cacheio import cached
from cacheio.mixins import Cacheable
class UserService(Cacheable):
    @cached(ttl=60)
    def fetch_user(self, user_id: int) -> dict:
        print(f"Fetching user {user_id} from DB...")
        return {"id": user_id, "name": f"User_{user_id}"}

service = UserService()
print(service.fetch_user(1))  # runs the method and caches the result
print(service.fetch_user(1))  # returns the cached result
```
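`@cached` derives its key automatically from the method's arguments. One plausible scheme (a sketch only; cacheio's exact key format may differ) hashes the function's qualified name together with its positional and sorted keyword arguments:

```python
import hashlib

def make_cache_key(func, args, kwargs):
    """Build a deterministic cache key from a callable and its arguments.

    Keyword arguments are sorted so argument order doesn't change the key.
    """
    parts = [func.__qualname__]
    parts.extend(repr(a) for a in args)
    parts.extend(f"{k}={kwargs[k]!r}" for k in sorted(kwargs))
    raw = "|".join(parts)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

def fetch_user(user_id):
    return {"id": user_id}

k1 = make_cache_key(fetch_user, (1,), {})
k2 = make_cache_key(fetch_user, (1,), {})
k3 = make_cache_key(fetch_user, (2,), {})
print(k1 == k2, k1 == k3)  # True False
```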
### 2. Synchronous `@memoized`
Allows a custom cache key function for more control.
```python
from cacheio import memoized
from cacheio.mixins import Cacheable
class UserService(Cacheable):
    @memoized(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
    def fetch_user(self, user_id: int, request_id: str) -> dict:
        print(f"Fetching user {user_id} with request {request_id}")
        return {"id": user_id, "request": request_id}

service = UserService()
print(service.fetch_user(1, request_id="abc"))  # cached by user_id only
print(service.fetch_user(1, request_id="xyz"))  # returns the cached result (same key)
```
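The behavior above — calls with different `request_id` values collapsing onto one cache entry — can be reproduced with a minimal dict-backed memoizing decorator. This is a simplified sketch of what `@memoized` does, with TTL handling omitted:

```python
import functools

def memoized(key_fn):
    """Cache results in a plain dict, keyed by a user-supplied key function."""
    def decorator(func):
        cache = {}
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = key_fn(*args, **kwargs)
            if key not in cache:
                cache[key] = func(*args, **kwargs)
            return cache[key]
        return wrapper
    return decorator

calls = []

class UserService:
    @memoized(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}")
    def fetch_user(self, user_id, request_id):
        calls.append(request_id)
        return {"id": user_id, "request": request_id}

service = UserService()
service.fetch_user(1, request_id="abc")
result = service.fetch_user(1, request_id="xyz")
# The second call hits the cache, so only one real fetch happens.
print(result, len(calls))
```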
### 3. Asynchronous `@async_cached`
Async version of `@cached`, for async methods.
```python
import asyncio
from cacheio import async_cached
from cacheio.mixins import AsyncCacheable
class AsyncUserService(AsyncCacheable):
    @async_cached(ttl=60)
    async def fetch_user(self, user_id: int) -> dict:
        print(f"Fetching user {user_id} asynchronously...")
        await asyncio.sleep(2)
        return {"id": user_id, "name": f"User_{user_id}"}

async def main():
    service = AsyncUserService()
    print(await service.fetch_user(1))  # runs and caches
    print(await service.fetch_user(1))  # returns the cached result

asyncio.run(main())
```
### 4. Asynchronous `@async_memoized`
Async decorator with a custom async key function.
```python
import asyncio
from cacheio import async_memoized
from cacheio.mixins import AsyncCacheable
class AsyncUserService(AsyncCacheable):
    @async_memoized(key_fn=lambda self, user_id, **kwargs: f"user:{user_id}", ttl=60)
    async def fetch_user(self, user_id: int, request_id: str) -> dict:
        print(f"Fetching user {user_id} with request {request_id} asynchronously...")
        await asyncio.sleep(2)
        return {"id": user_id, "request": request_id}

async def main():
    service = AsyncUserService()
    print(await service.fetch_user(1, request_id="abc"))  # cached
    print(await service.fetch_user(1, request_id="xyz"))  # returns the cached result

asyncio.run(main())
```
---
## Configuration
You can customize global caching behavior via the `config` object and the `configure()` function.
Example:
```python
from cacheio import config, configure
def update_settings(cfg):
    cfg.default_ttl = 600
    cfg.default_threshold = 1000

configure(update_settings)
```
This allows centralized control of defaults like TTL and cache size threshold.
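The `configure()` pattern — passing a function that mutates a shared config object — is a common way to centralize defaults. A minimal sketch of how such a pattern works, using the same field names as the example above (the `Config` class here is hypothetical, not cacheio's):

```python
from dataclasses import dataclass

@dataclass
class Config:
    default_ttl: int = 300
    default_threshold: int = 500

config = Config()  # single shared instance

def configure(update_fn):
    """Apply a user-supplied mutation to the shared config."""
    update_fn(config)

def update_settings(cfg):
    cfg.default_ttl = 600
    cfg.default_threshold = 1000

configure(update_settings)
print(config.default_ttl, config.default_threshold)  # 600 1000
```

Because every cache reads its defaults from the same shared object, a single `configure()` call changes behavior application-wide.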
---
## Contributing
Contributions are welcome! Feel free to open issues or submit pull requests on our [GitHub repository](https://github.com/bnlucas/cacheio).
---
## License
`cacheio` is distributed under the MIT license. See the [LICENSE](https://github.com/bnlucas/cacheio/blob/main/LICENSE) file for details.
Raw data
{
"_id": null,
"home_page": "https://github.com/bnlucas/cacheio",
"name": "cacheio",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.10",
"maintainer_email": null,
"keywords": "cache, caching, asyncio, synchronous, aiohttp, cachelib",
"author": "Nathan Lucas",
"author_email": "nlucas@bnlucas.com",
"download_url": "https://files.pythonhosted.org/packages/56/07/79525a1f569f140d527784c4763cac32edcf512c40d57ac943588def00f8/cacheio-0.1.2.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "MIT",
"summary": "cacheio is a flexible, user-friendly Python caching interface that unifies synchronous and asynchronous caching backends. It provides a consistent API for interacting with proven libraries\u2014cachelib for synchronous caching and aiocache for asynchronous caching\u2014allowing seamless integration of both styles in your applications. With configurable defaults, optional backend installation, and easy-to-use decorators, cacheio simplifies caching logic while minimizing dependencies.",
"version": "0.1.2",
"project_urls": {
"Homepage": "https://github.com/bnlucas/cacheio",
"Repository": "https://github.com/bnlucas/cacheio"
},
"split_keywords": [
"cache",
" caching",
" asyncio",
" synchronous",
" aiohttp",
" cachelib"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "ba535f162f196eeb8bc1178901bc8c32695b7706c765425ec38021e18c83395f",
"md5": "27e1a3fbe8299e90090b655ceccb909a",
"sha256": "8717dff475dcbbd5dfa138c257e0d9b6d067ba90f89a36c1cb48859a056fe9da"
},
"downloads": -1,
"filename": "cacheio-0.1.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "27e1a3fbe8299e90090b655ceccb909a",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.10",
"size": 17582,
"upload_time": "2025-08-08T03:38:40",
"upload_time_iso_8601": "2025-08-08T03:38:40.455585Z",
"url": "https://files.pythonhosted.org/packages/ba/53/5f162f196eeb8bc1178901bc8c32695b7706c765425ec38021e18c83395f/cacheio-0.1.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "560779525a1f569f140d527784c4763cac32edcf512c40d57ac943588def00f8",
"md5": "c86a739939115b1a49fb5708f94cc142",
"sha256": "2622d4de93276e569170f29daed14b9e2bb4e9cb00dd76e6dd1c1572c205f635"
},
"downloads": -1,
"filename": "cacheio-0.1.2.tar.gz",
"has_sig": false,
"md5_digest": "c86a739939115b1a49fb5708f94cc142",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.10",
"size": 12189,
"upload_time": "2025-08-08T03:38:41",
"upload_time_iso_8601": "2025-08-08T03:38:41.269245Z",
"url": "https://files.pythonhosted.org/packages/56/07/79525a1f569f140d527784c4763cac32edcf512c40d57ac943588def00f8/cacheio-0.1.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-08 03:38:41",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "bnlucas",
"github_project": "cacheio",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"tox": true,
"lcname": "cacheio"
}