# AI CacheKit
[Build status](https://github.com/EDLadder/ai-cachekit/actions)
[PyPI version](https://badge.fury.io/py/ai-cachekit)
[License: MIT](https://opensource.org/licenses/MIT)
A lightweight caching library for AI/LLM API responses.
Reduce cost and improve performance by storing API responses locally under hash-based keys, with optional TTL expiration.
---
## Features
- 🔹 Simple API: `get`, `set`
- 🔹 Local JSON storage (no external DB required)
- 🔹 Optional TTL (time-to-live) for cache expiration
- 🔹 Perfect for OpenAI, Anthropic, Ollama, etc.
---
## Installation
**From GitHub (development version):**
```bash
pip install git+https://github.com/EDLadder/ai-cachekit.git
```
**From PyPI:**
```bash
pip install ai-cachekit
```
---
## Usage
```python
from ai_cachekit import AIResponseCache
cache = AIResponseCache(backend="memory")
cache.set("question", "answer")
print(cache.get("question"))
```
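In practice you usually key the cache on a hash of the prompt and only hit the provider on a miss. The sketch below is not part of the library: `call_llm` is a placeholder for whatever client you use (OpenAI, Anthropic, Ollama, ...), and it assumes `get` returns `None` for a missing key.

```python
import hashlib

def cached_completion(cache, prompt, call_llm):
    # Hash the prompt so long inputs map to short, stable cache keys.
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    response = cache.get(key)
    if response is None:  # assumption: a cache miss returns None
        response = call_llm(prompt)  # placeholder for your provider call
        cache.set(key, response)
    return response
```

The same wrapper works unchanged with the file and Redis backends shown below.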
### File-Based Cache (JSON)
```python
cache = AIResponseCache(backend="file", filepath="my_cache.json")
cache.set("key", "value")
print(cache.get("key"))
```
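The point of the file backend is that entries outlive the process. Continuing the example above, and assuming the backend flushes writes to `my_cache.json` on `set`:

```python
# Later, possibly in a different process: open the same JSON file again.
cache2 = AIResponseCache(backend="file", filepath="my_cache.json")
print(cache2.get("key"))  # expected to print "value" read back from disk
```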
### Redis Cache
```python
# Requires a running Redis server reachable at host:port.
cache = AIResponseCache(backend="redis", host="localhost", port=6379)
cache.set("key", "value")
print(cache.get("key"))
```
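### TTL (expiration)

The features above mention optional TTL expiration, but the exact signature isn't shown here. Treat the `ttl` keyword below as a hypothetical illustration and check the library source for the real parameter name and placement:

```python
import time

# Hypothetical ttl keyword: the real parameter (and whether it lives on the
# constructor or on set()) may differ.
cache = AIResponseCache(backend="memory", ttl=60)
cache.set("question", "answer")
print(cache.get("question"))  # "answer" while the entry is fresh
time.sleep(61)
print(cache.get("question"))  # expected miss once the TTL has elapsed
```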
---
## Why?
- Avoid repeated API calls (save cost & time)
- Minimal dependencies and setup
- Flexible for any AI API (OpenAI, LLaMA, etc.)
---
## Development
Clone the repository, install the development dependencies, and run the tests:
```bash
git clone https://github.com/EDLadder/ai-cachekit.git
cd ai-cachekit
pip install -r requirements.txt
pytest
```
---
## Plans
- [x] Support for Redis and SQLite backends.
- [ ] CLI tool for managing cache.
- [ ] Built-in stats: hit rate, saved cost estimation.
- [ ] Encryption for cached data.
---
## License
MIT License – free to use and modify.