| Field | Value |
| --- | --- |
| Name | dataknobs-kv |
| Version | 0.1.0 |
| Summary | Key/value store abstraction with hierarchical keys, pattern matching, and metadata support |
| Upload time | 2025-08-18 03:33:50 |
| Requires Python | >=3.10 |
| Keywords | cache, database, key-value, kv, storage, store |
# DataKnobs KV Package
A powerful key/value store abstraction with hierarchical keys, pattern matching, and rich metadata support, built on top of the dataknobs-data package.
## Overview
The `dataknobs-kv` package provides a simple yet powerful interface for managing key/value pairs across multiple storage backends. It features hierarchical key structures, advanced pattern matching, metadata management, and seamless integration with various storage technologies through the dataknobs-data abstraction layer.
## Features
- **Hierarchical Keys**: Dot-notation key paths for organized data structure
- **Pattern Matching**: Powerful glob-style patterns for key operations
- **Rich Metadata**: Associate custom metadata with every key/value pair
- **Multiple Backends**: Memory, File, PostgreSQL, Elasticsearch, S3, and more
- **TTL Support**: Automatic expiration for time-sensitive data
- **Atomic Operations**: Compare-and-swap, increment, append operations
- **Namespace Isolation**: Multi-tenant support with isolated namespaces
- **Type Safety**: Strong typing with automatic serialization/deserialization
## Installation
```bash
# Basic installation
pip install dataknobs-kv
# With caching support (quote the extras so shells like zsh don't expand the brackets)
pip install "dataknobs-kv[cache]"
# With all features
pip install "dataknobs-kv[all]"
```
## Quick Start
```python
from dataknobs_kv import KVStore
# Create a store with memory backend
store = KVStore(backend="memory")
# Basic operations
await store.set("app.name", "MyApplication")
await store.set("app.version", "1.0.0")
await store.set("app.config.debug", True)
# Get values
name = await store.get("app.name")
debug = await store.get("app.config.debug")
# Pattern matching
app_keys = await store.keys("app.*")
all_config = await store.get_pattern("app.config.*")
# Delete keys
await store.delete("app.config.debug")
await store.delete_pattern("app.temp.*")
```
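The snippets above use bare `await`, which works in a REPL with top-level await support (e.g. `python -m asyncio`); in a script they need an async entry point. A minimal stand-in for the store illustrates the wiring (`MemoryKV` here is a hypothetical sketch, not the package's actual class):

```python
import asyncio

class MemoryKV:
    """Tiny async in-memory stand-in, used only to show the call pattern."""
    def __init__(self):
        self._data = {}

    async def set(self, key, value):
        self._data[key] = value

    async def get(self, key, default=None):
        return self._data.get(key, default)

async def main():
    store = MemoryKV()
    await store.set("app.name", "MyApplication")
    return await store.get("app.name")

print(asyncio.run(main()))  # MyApplication
```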
## Hierarchical Key Structure
```python
# Organize data with dot-notation paths
await store.set("users.123.name", "John Doe")
await store.set("users.123.email", "john@example.com")
await store.set("users.123.settings.theme", "dark")
await store.set("users.123.settings.notifications", True)
# Get all user data
user_data = await store.get_pattern("users.123.*")
# Get all user settings
user_settings = await store.get_pattern("users.123.settings.*")
# Delete entire user
await store.delete_pattern("users.123.**")
```
## Pattern Matching
```python
# Wildcards
await store.keys("users.*.name") # All user names
await store.keys("*.config.*") # All config at any level
await store.keys("logs.2024-*") # All 2024 logs
# Recursive wildcards
await store.keys("app.**") # Everything under app
await store.keys("**.error") # All error keys at any level
# Single character and sets
await store.keys("user.?.name") # user.1.name, user.a.name, etc.
await store.keys("log.[0-9].txt") # log.0.txt through log.9.txt
# Alternatives
await store.keys("env.{dev,prod}.config") # dev or prod config
```
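One plausible way to implement this syntax is to translate each glob pattern into a regular expression, treating `.` as the segment separator so `*` stays within one segment while `**` crosses them. The translator below is an illustrative sketch under those assumptions, not the package's actual matching engine:

```python
import re

def glob_to_regex(pattern: str) -> re.Pattern:
    """Translate the glob syntax above to a regex: '*' = one segment,
    '**' = any depth, '?' = one char, '[...]' = char set, '{a,b}' = alternatives."""
    out, i = [], 0
    while i < len(pattern):
        if pattern[i:i + 2] == "**":
            out.append(".*")          # any depth, dots included
            i += 2
        elif pattern[i] == "*":
            out.append("[^.]*")       # single segment: must not cross a dot
            i += 1
        elif pattern[i] == "?":
            out.append("[^.]")
            i += 1
        elif pattern[i] == "{":
            j = pattern.index("}", i)
            alts = pattern[i + 1:j].split(",")
            out.append("(" + "|".join(map(re.escape, alts)) + ")")
            i = j + 1
        elif pattern[i] == "[":
            j = pattern.index("]", i)
            out.append(pattern[i:j + 1])  # pass the char class through as-is
            i = j + 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("^" + "".join(out) + "$")
```

With this translation, `users.*.name` matches `users.123.name` but not `users.123.settings.name`, while `app.**` matches everything beneath `app`.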
## Metadata Management
```python
# Set with metadata
await store.set(
    "document.report.pdf",
    document_bytes,
    metadata={
        "content_type": "application/pdf",
        "author": "John Doe",
        "created": "2024-01-15",
        "tags": ["quarterly", "finance"],
        "ttl": 86400  # Expire in 24 hours
    }
)
# Get metadata
metadata = await store.get_metadata("document.report.pdf")
print(f"Author: {metadata['author']}")
print(f"Tags: {metadata['tags']}")
# Update metadata
await store.set_metadata("document.report.pdf", {
    "reviewed": True,
    "reviewer": "Jane Smith"
})
```
## TTL and Expiration
```python
# Set with TTL (time to live)
await store.set("session.abc123", session_data, ttl=3600) # 1 hour
await store.set("cache.results", results, ttl=300) # 5 minutes
# Check if key exists (returns False if expired)
if await store.exists("session.abc123"):
    data = await store.get("session.abc123")
# Manual cleanup of expired keys
expired_count = await store.cleanup_expired()
print(f"Removed {expired_count} expired keys")
# Automatic cleanup (runs in background)
store = KVStore(backend="memory", auto_cleanup=True, cleanup_interval=60)
```
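A common way to implement TTL semantics is to store an absolute expiry timestamp alongside each value and check it lazily on access; the sketch below shows that bookkeeping (a generic illustration, not the package's implementation):

```python
import time

class TTLDict:
    """Sketch of TTL bookkeeping: each entry carries an absolute expiry."""
    def __init__(self):
        self._data = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ttl=None):
        expires = time.monotonic() + ttl if ttl is not None else None
        self._data[key] = (value, expires)

    def exists(self, key):
        entry = self._data.get(key)
        if entry is None:
            return False
        _, expires = entry
        if expires is not None and time.monotonic() >= expires:
            del self._data[key]  # lazy expiry on access
            return False
        return True

    def cleanup_expired(self):
        now = time.monotonic()
        dead = [k for k, (_, exp) in self._data.items()
                if exp is not None and now >= exp]
        for k in dead:
            del self._data[k]
        return len(dead)
```

Using `time.monotonic()` rather than wall-clock time keeps expiry correct across system clock adjustments.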
## Atomic Operations
```python
# Compare and swap (CAS)
success = await store.compare_and_swap(
    "version",
    old_value="1.0.0",
    new_value="1.1.0"
)
# Increment/decrement counters
views = await store.increment("stats.page_views")
remaining = await store.increment("inventory.item_123", delta=-1)
# Append to strings
await store.append("logs.access", "\n2024-01-15 10:30 User login")
await store.append("notes", ", Remember to update docs")
```
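For a single-process async store, atomicity can be achieved by serializing each read-modify-write under a lock, so CAS and increment never interleave with other coroutines. This is an illustrative sketch with hypothetical names, not the package's code (database backends would instead rely on the backend's own atomic primitives):

```python
import asyncio

class AtomicKV:
    """Sketch: an asyncio.Lock makes read-modify-write atomic per coroutine."""
    def __init__(self):
        self._data = {}
        self._lock = asyncio.Lock()

    async def compare_and_swap(self, key, old_value, new_value):
        async with self._lock:
            if self._data.get(key) != old_value:
                return False  # someone else changed it first
            self._data[key] = new_value
            return True

    async def increment(self, key, delta=1):
        async with self._lock:
            value = self._data.get(key, 0) + delta
            self._data[key] = value
            return value
```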
## Batch Operations
```python
# Set multiple values
await store.set_many({
    "config.host": "localhost",
    "config.port": 8080,
    "config.ssl": True,
    "config.timeout": 30
})
# Get multiple values
values = await store.get_many([
    "config.host",
    "config.port",
    "config.ssl"
])
# Delete multiple keys
deleted = await store.delete_many([
    "temp.file1",
    "temp.file2",
    "cache.old"
])
```
## Namespace Isolation
```python
# Create isolated namespaces
user_store = KVStore(backend="postgres", namespace="user_data")
system_store = KVStore(backend="postgres", namespace="system")
cache_store = KVStore(backend="memory", namespace="cache")
# Same keys, different namespaces
await user_store.set("settings", {"theme": "dark"})
await system_store.set("settings", {"debug": True})
# Values are isolated
user_settings = await user_store.get("settings") # {"theme": "dark"}
system_settings = await system_store.get("settings") # {"debug": True}
```
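One plausible way to layer namespace isolation over a flat backend is to prefix every key with the namespace, so two views of the same backing store never collide. The sketch below illustrates that idea (hypothetical names, not the package's design):

```python
class NamespacedView:
    """Sketch: isolate namespaces by prefixing keys into a shared flat store."""
    def __init__(self, backing: dict, namespace: str):
        self._backing = backing
        self._prefix = namespace + ":"

    def set(self, key, value):
        self._backing[self._prefix + key] = value

    def get(self, key, default=None):
        return self._backing.get(self._prefix + key, default)

shared = {}
user_view = NamespacedView(shared, "user_data")
system_view = NamespacedView(shared, "system")
user_view.set("settings", {"theme": "dark"})
system_view.set("settings", {"debug": True})
```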
## Backend Configuration
### Memory Backend
```python
store = KVStore(backend="memory")
```
### File Backend
```python
store = KVStore(backend="file", config={
    "path": "/data/kv_store.json",
    "format": "json",
    "compression": "gzip"
})
```
### PostgreSQL Backend
```python
store = KVStore(backend="postgres", config={
    "host": "localhost",
    "database": "myapp",
    "table": "kv_store",
    "user": "dbuser",
    "password": "dbpass"
})
```
### Redis Backend
```python
store = KVStore(backend="redis", config={
    "host": "localhost",
    "port": 6379,
    "db": 0,
    "password": "optional"
})
```
### S3 Backend
```python
store = KVStore(backend="s3", config={
    "bucket": "my-kv-store",
    "prefix": "data/",
    "region": "us-west-2"
})
```
## Advanced Usage
### Custom Serialization
```python
import pickle
# Custom serializer for complex objects
class CustomStore(KVStore):
    def serialize(self, value):
        if isinstance(value, MyComplexClass):
            return pickle.dumps(value)
        return super().serialize(value)

    def deserialize(self, data, value_type):
        if value_type == "custom":
            return pickle.loads(data)
        return super().deserialize(data, value_type)
```
### Caching Layer
```python
# Add caching for performance
store = KVStore(
    backend="postgres",
    cache="memory",
    cache_ttl=60,    # Cache for 60 seconds
    cache_size=1000  # Max 1000 items in cache
)
```
### Migration Between Backends
```python
from dataknobs_kv import migrate_store
# Migrate from file to PostgreSQL
source = KVStore(backend="file", config={"path": "data.json"})
target = KVStore(backend="postgres", config=postgres_config)
await migrate_store(source, target, batch_size=100)
```
## Development
```bash
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Run tests with coverage
pytest --cov=dataknobs_kv
# Type checking
mypy src/dataknobs_kv
# Linting
ruff check src/dataknobs_kv
# Format code
black src/dataknobs_kv
```
## Architecture
The package is built on top of dataknobs-data, providing:
- **Keys**: Hierarchical path management with validation
- **Values**: Type-safe serialization/deserialization
- **Patterns**: Advanced pattern matching engine
- **Metadata**: Rich metadata with system and user fields
- **Namespaces**: Isolation for multi-tenant applications
- **Backends**: Leverages dataknobs-data storage backends
## Performance
- Optimized pattern matching with compiled regex
- Efficient batch operations
- Optional caching layer
- Connection pooling for database backends
- Async/await support for concurrent operations
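The compiled-regex point can be illustrated generically: caching the compilation step amortizes its cost across repeated `keys()` calls with the same pattern. This sketch uses the standard library's `fnmatch.translate` plus `lru_cache` (a generic technique, not the package's actual code):

```python
import fnmatch
import re
from functools import lru_cache

@lru_cache(maxsize=256)
def compiled(pattern: str) -> re.Pattern:
    # fnmatch.translate converts a glob to a regex source string;
    # caching means each distinct pattern is compiled only once.
    return re.compile(fnmatch.translate(pattern))

def match_keys(keys, pattern):
    rx = compiled(pattern)
    return [k for k in keys if rx.match(k)]
```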
## Contributing
Contributions are welcome! Please see our [Contributing Guide](../../CONTRIBUTING.md) for details.
## License
This project is licensed under the MIT License - see the [LICENSE](../../LICENSE) file for details.
## Raw data
{
"_id": null,
"home_page": null,
"name": "dataknobs-kv",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "cache, database, key-value, kv, storage, store",
"author": null,
"author_email": "DataKnobs Team <team@dataknobs.com>",
"download_url": "https://files.pythonhosted.org/packages/3d/4f/403510416ea8a57c935cc58585bc6b731b577573c6d96b84e81479cb1111/dataknobs_kv-0.1.0.tar.gz",
"platform": null,
"description": "(identical to the README above)",
"bugtrack_url": null,
"license": null,
"summary": "Key/value store abstraction with hierarchical keys, pattern matching, and metadata support",
"version": "0.1.0",
"project_urls": {
"Bug Tracker": "https://github.com/dataknobs/dataknobs/issues",
"Documentation": "https://dataknobs.readthedocs.io",
"Homepage": "https://github.com/dataknobs/dataknobs"
},
"split_keywords": [
"cache",
" database",
" key-value",
" kv",
" storage",
" store"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "e89a7e58c5d713cdeea72c3965e20523eebbaab6e0f957e3f66d370984fc6aab",
"md5": "fd66642c8cdc21e03013adc4c5998f2d",
"sha256": "897d86c8fb6dbc4b3b085776b2363a0d4d7de220203cbb6d0449056f8c0fb02d"
},
"downloads": -1,
"filename": "dataknobs_kv-0.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "fd66642c8cdc21e03013adc4c5998f2d",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 4243,
"upload_time": "2025-08-18T03:33:49",
"upload_time_iso_8601": "2025-08-18T03:33:49.577666Z",
"url": "https://files.pythonhosted.org/packages/e8/9a/7e58c5d713cdeea72c3965e20523eebbaab6e0f957e3f66d370984fc6aab/dataknobs_kv-0.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "3d4f403510416ea8a57c935cc58585bc6b731b577573c6d96b84e81479cb1111",
"md5": "e32d6183fa2be912a258c061c409184a",
"sha256": "252f4f198e370be47f58b6cfff043c12605d5a89e94bb87f7f941c0d346216e3"
},
"downloads": -1,
"filename": "dataknobs_kv-0.1.0.tar.gz",
"has_sig": false,
"md5_digest": "e32d6183fa2be912a258c061c409184a",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 10104,
"upload_time": "2025-08-18T03:33:50",
"upload_time_iso_8601": "2025-08-18T03:33:50.864971Z",
"url": "https://files.pythonhosted.org/packages/3d/4f/403510416ea8a57c935cc58585bc6b731b577573c6d96b84e81479cb1111/dataknobs_kv-0.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-18 03:33:50",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "dataknobs",
"github_project": "dataknobs",
"github_not_found": true,
"lcname": "dataknobs-kv"
}