| Field | Value |
| --- | --- |
| Name | libcachesim |
| Version | 0.3.2 |
| Summary | Python bindings for libCacheSim |
| Home page | None |
| Docs URL | None |
| Author | None |
| Maintainer | None |
| License | None |
| Requires Python | >=3.9 |
| Keywords | performance, cache, simulator |
| Upload time | 2025-07-14 08:12:45 |
# libCacheSim Python Binding
Python bindings for libCacheSim, a high-performance cache simulator and analysis library.
## Installation
Binary installers for the latest released version are available at the [Python Package Index (PyPI)](https://pypi.org/project/libcachesim).
```bash
pip install libcachesim
```
### Installation from sources
If there are no wheels suitable for your environment, consider building from source.
```bash
git clone https://github.com/1a1a11a/libCacheSim.git
cd libCacheSim
# Build the main libCacheSim library first
cmake -G Ninja -B build
ninja -C build
# Install Python binding
cd libCacheSim-python
pip install -e .
```
### Testing
```bash
# Run all tests
python -m pytest .
# Test import
python -c "import libcachesim; print('Success!')"
```
## Quick Start
### Basic Usage
```python
import libcachesim as lcs
# Create a cache
cache = lcs.LRU(cache_size=1024*1024) # 1MB cache
# Process requests
req = lcs.Request()
req.obj_id = 1
req.obj_size = 100
print(cache.get(req)) # False (first access)
print(cache.get(req)) # True (second access)
```
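To see when evictions start, here is a small illustrative sketch built only from the `Request`/`get()` API above; the 10 KB cache size and the synthetic request loop are chosen for demonstration, and the expected result assumes the LRU behavior shown in this example.
```python
import libcachesim as lcs

# A deliberately small cache that cannot hold the whole working set.
cache = lcs.LRU(cache_size=10 * 1024)  # 10 KB

# Insert 200 objects of 100 bytes each (~20 KB total), exceeding the cache size.
for obj_id in range(200):
    req = lcs.Request()
    req.obj_id = obj_id
    req.obj_size = 100
    cache.get(req)

# Object 0 was touched least recently, so under LRU it should have been evicted.
req = lcs.Request()
req.obj_id = 0
req.obj_size = 100
print(cache.get(req))  # Expected: False (evicted)
```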
### Trace Processing
```python
import libcachesim as lcs
# Open trace and process efficiently
reader = lcs.open_trace("./data/cloudPhysicsIO.oracleGeneral.bin", lcs.TraceType.ORACLE_GENERAL_TRACE)
cache = lcs.S3FIFO(cache_size=1024*1024)
# Process entire trace efficiently (C++ backend)
miss_ratio = cache.process_trace(reader)
print(f"Miss ratio: {miss_ratio:.4f}")
# Process with limits and time ranges
miss_ratio = cache.process_trace(
    reader,
    start_req=100,
    max_req=1000
)
print(f"Miss ratio: {miss_ratio:.4f}")
```
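The same `start_req`/`max_req` arguments can be used to look at a trace window by window. The sketch below is illustrative: the window size is arbitrary, and the trace is reopened for each window so the result does not depend on how the reader position is handled across `process_trace()` calls.
```python
import libcachesim as lcs

trace_path = "./data/cloudPhysicsIO.oracleGeneral.bin"
window = 1000  # requests per window (illustrative)

# Miss ratio over consecutive windows of the trace.
for start in range(0, 5000, window):
    reader = lcs.open_trace(trace_path, lcs.TraceType.ORACLE_GENERAL_TRACE)
    cache = lcs.S3FIFO(cache_size=1024 * 1024)
    miss_ratio = cache.process_trace(reader, start_req=start, max_req=window)
    print(f"requests [{start}, {start + window}): miss ratio {miss_ratio:.4f}")
```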
## Custom Cache Policies
Implement custom cache replacement algorithms using pure Python functions, with **no C/C++ compilation required**.
### Python Hook Cache Overview
The `PythonHookCachePolicy` allows you to define custom caching behavior through Python callback functions. This is perfect for:
- Prototyping new cache algorithms
- Educational purposes and learning
- Research and experimentation
- Custom business logic implementation
### Hook Functions
You need to implement these callback functions:
- **`init_hook(cache_size: int) -> Any`**: Initialize your data structure
- **`hit_hook(data: Any, obj_id: int, obj_size: int) -> None`**: Handle cache hits
- **`miss_hook(data: Any, obj_id: int, obj_size: int) -> None`**: Handle cache misses
- **`eviction_hook(data: Any, obj_id: int, obj_size: int) -> int`**: Return the ID of the object to evict
- **`remove_hook(data: Any, obj_id: int) -> None`**: Clean up when an object is removed
- **`free_hook(data: Any) -> None`**: [Optional] Final cleanup
### Example: Custom LRU Implementation
```python
import libcachesim as lcs
from collections import OrderedDict
# Create a Python hook-based cache
cache = lcs.PythonHookCachePolicy(cache_size=1024*1024, cache_name="MyLRU")
# Define LRU policy hooks
def init_hook(cache_size):
    return OrderedDict()  # Track access order

def hit_hook(lru_dict, obj_id, obj_size):
    lru_dict.move_to_end(obj_id)  # Move to most recent

def miss_hook(lru_dict, obj_id, obj_size):
    lru_dict[obj_id] = True  # Add to end

def eviction_hook(lru_dict, obj_id, obj_size):
    return next(iter(lru_dict))  # Return least recent

def remove_hook(lru_dict, obj_id):
    lru_dict.pop(obj_id, None)

# Set the hooks
cache.set_hooks(init_hook, hit_hook, miss_hook, eviction_hook, remove_hook)
# Use it like any other cache
req = lcs.Request()
req.obj_id = 1
req.obj_size = 100
hit = cache.get(req)
print(f"Cache hit: {hit}") # Should be False (miss)
```
### Example: Custom FIFO Implementation
```python
import libcachesim as lcs
from collections import deque
# Create a custom FIFO cache
cache = lcs.PythonHookCachePolicy(cache_size=1024, cache_name="CustomFIFO")
def init_hook(cache_size):
    return deque()  # Use deque for FIFO order

def hit_hook(fifo_queue, obj_id, obj_size):
    pass  # FIFO doesn't reorder on hit

def miss_hook(fifo_queue, obj_id, obj_size):
    fifo_queue.append(obj_id)  # Add to end of queue

def eviction_hook(fifo_queue, obj_id, obj_size):
    return fifo_queue[0]  # Return first item (oldest)

def remove_hook(fifo_queue, obj_id):
    if fifo_queue and fifo_queue[0] == obj_id:
        fifo_queue.popleft()

# Set the hooks and test
cache.set_hooks(init_hook, hit_hook, miss_hook, eviction_hook, remove_hook)
req = lcs.Request()
req.obj_id = 1
req.obj_size = 100
hit = cache.get(req)
print(f"Cache hit: {hit}") # Should be False (miss)
```
## Available Algorithms
### Built-in Cache Algorithms
#### Basic Algorithms
- **FIFO**: First-In-First-Out
- **LRU**: Least Recently Used
- **LFU**: Least Frequently Used
- **Clock**: Clock/Second-chance algorithm
#### Advanced Algorithms
- **S3FIFO**: Simple, Fast, Fair FIFO (recommended for most workloads)
- **Sieve**: High-performance eviction algorithm
- **ARC**: Adaptive Replacement Cache
- **TwoQ**: Two-Queue algorithm
- **TinyLFU**: TinyLFU with window
- **SLRU**: Segmented LRU
#### Research/ML Algorithms
- **LRB**: Learning-based cache (if enabled)
- **GLCache**: Machine learning-based cache
- **ThreeLCache**: Three-level cache hierarchy (if enabled)
```python
import libcachesim as lcs
# All algorithms use the same unified interface
cache_size = 1024 * 1024 # 1MB
lru_cache = lcs.LRU(cache_size)
s3fifo_cache = lcs.S3FIFO(cache_size) # Recommended
sieve_cache = lcs.Sieve(cache_size)
arc_cache = lcs.ARC(cache_size)
# All caches work identically
req = lcs.Request()
req.obj_id = 1
req.obj_size = 100
hit = lru_cache.get(req)
```
## Examples and Testing
### Algorithm Comparison
```python
import libcachesim as lcs
def compare_algorithms(trace_path):
    reader = lcs.open_trace(trace_path, lcs.TraceType.VSCSI_TRACE)
    algorithms = ['LRU', 'S3FIFO', 'Sieve', 'ARC']
    for algo_name in algorithms:
        cache = getattr(lcs, algo_name)(cache_size=1024*1024)
        miss_ratio = cache.process_trace(reader)
        print(f"{algo_name}\t\t{miss_ratio:.4f}")

compare_algorithms("./data/cloudPhysicsIO.vscsi")
```
### Performance Benchmarking
```python
import time
import libcachesim as lcs

def benchmark_cache(cache, num_requests=100000):
    """Benchmark cache performance"""
    start_time = time.time()
    for i in range(num_requests):
        req = lcs.Request()
        req.obj_id = i % 1000  # Working set of 1000 objects
        req.obj_size = 100
        cache.get(req)
    end_time = time.time()
    throughput = num_requests / (end_time - start_time)
    print(f"Processed {num_requests} requests in {end_time - start_time:.2f}s")
    print(f"Throughput: {throughput:.0f} requests/sec")

# Compare performance
lru_cache = lcs.LRU(cache_size=1024*1024)
s3fifo_cache = lcs.S3FIFO(cache_size=1024*1024)
print("LRU Performance:")
benchmark_cache(lru_cache)
print("\nS3FIFO Performance:")
benchmark_cache(s3fifo_cache)
```
## Advanced Usage
### Multi-Format Trace Processing
```python
import libcachesim as lcs
# Supported trace types
trace_types = {
    "oracle": lcs.TraceType.ORACLE_GENERAL_TRACE,
    "csv": lcs.TraceType.CSV_TRACE,
    "vscsi": lcs.TraceType.VSCSI_TRACE,
    "txt": lcs.TraceType.PLAIN_TXT_TRACE,
}

# Open different trace formats
oracle_reader = lcs.open_trace("./data/cloudPhysicsIO.oracleGeneral.bin", trace_types["oracle"])
txt_reader = lcs.open_trace("./data/cloudPhysicsIO.txt", trace_types["txt"])

# Process traces with different caches
caches = [
    lcs.LRU(cache_size=1024*1024),
    lcs.S3FIFO(cache_size=1024*1024),
    lcs.Sieve(cache_size=1024*1024),
]

for i, cache in enumerate(caches):
    miss_ratio_oracle = cache.process_trace(oracle_reader)
    miss_ratio_txt = cache.process_trace(txt_reader)
    print(f"Cache {i} miss ratio: {miss_ratio_oracle:.4f}, {miss_ratio_txt:.4f}")
```
## Troubleshooting
### Common Issues
**Import Error**: Make sure the libCacheSim C++ library is built first:
```bash
cmake -G Ninja -B build && ninja -C build
```
**Performance Issues**: Use `process_trace()` for large workloads instead of individual `get()` calls for better performance.
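To make the difference concrete, the sketch below times both paths; the trace path, cache choice, and synthetic request loop mirror the examples above, and the actual speedup will depend on your environment.
```python
import time
import libcachesim as lcs

cache_size = 1024 * 1024

# Fast path: the C++ backend replays the whole trace in a single call.
reader = lcs.open_trace("./data/cloudPhysicsIO.oracleGeneral.bin",
                        lcs.TraceType.ORACLE_GENERAL_TRACE)
cache = lcs.S3FIFO(cache_size=cache_size)
t0 = time.time()
miss_ratio = cache.process_trace(reader)
print(f"process_trace: miss ratio {miss_ratio:.4f} in {time.time() - t0:.2f}s")

# Slow path: a Python loop crosses the binding boundary once per request
# (synthetic requests, as in the benchmarking section above).
cache = lcs.S3FIFO(cache_size=cache_size)
t0 = time.time()
for i in range(100000):
    req = lcs.Request()
    req.obj_id = i % 1000
    req.obj_size = 100
    cache.get(req)
print(f"per-request get(): 100000 requests in {time.time() - t0:.2f}s")
```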
**Memory Usage**: Monitor cache statistics (`cache.occupied_byte`) and ensure proper cache size limits for your system.
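For example, a minimal check, assuming `cache.occupied_byte` reports the bytes currently held as mentioned above; the request values are illustrative.
```python
import libcachesim as lcs

cache_size = 1024 * 1024  # 1MB limit
cache = lcs.LRU(cache_size=cache_size)

req = lcs.Request()
req.obj_id = 1
req.obj_size = 100
cache.get(req)

# The occupied bytes should stay at or below the configured cache size.
print(f"occupied: {cache.occupied_byte} / {cache_size} bytes")
```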
**Custom Cache Issues**: Validate a custom policy by replaying the same requests through a built-in algorithm and comparing the hit/miss decisions, as sketched below.
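A minimal validation sketch, reusing the LRU hooks from the example above (written as lambdas for brevity, which assumes `set_hooks()` accepts any callables with the documented signatures); the access pattern is illustrative.
```python
import libcachesim as lcs
from collections import OrderedDict

# Hook-based LRU (same logic as the custom LRU example above).
custom_cache = lcs.PythonHookCachePolicy(cache_size=1024 * 1024, cache_name="MyLRU")
custom_cache.set_hooks(
    lambda cache_size: OrderedDict(),                      # init
    lambda d, obj_id, obj_size: d.move_to_end(obj_id),     # hit
    lambda d, obj_id, obj_size: d.update({obj_id: True}),  # miss
    lambda d, obj_id, obj_size: next(iter(d)),             # eviction
    lambda d, obj_id: d.pop(obj_id, None),                 # remove
)

# Reference implementation to compare against.
builtin_cache = lcs.LRU(cache_size=1024 * 1024)

mismatches = 0
for obj_id in [1, 2, 3, 1, 4, 2, 5, 1]:
    req = lcs.Request()
    req.obj_id = obj_id
    req.obj_size = 100
    if custom_cache.get(req) != builtin_cache.get(req):
        mismatches += 1

print(f"Mismatching hit/miss decisions: {mismatches}")
```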
### Getting Help
- Check the [main documentation](../doc/) for detailed guides
- Open issues on [GitHub](https://github.com/1a1a11a/libCacheSim/issues)
- Review [examples](/example) in the main repository