# omnifetch
**Multi-source data retrieval with intelligent caching and storage backends**
Fetch data from anywhere, in any format, with automatic TTL-based caching, multiple storage backends, and smart synchronization.
## ✨ Features
- ✅ **Multi-source data retrieval** - Local files, IBM Cloud Object Storage, or custom backends
- ✅ **Intelligent TTL-based caching** - Automatic freshness management
- ✅ **Concurrent access protection** - File locking for safe multi-process usage
- ✅ **Batch operations** - Efficient bulk data retrieval
- ✅ **Retry strategies** - Configurable retry logic for reliability
- ✅ **Multiple data formats** - Parquet, CSV, NetCDF, pickle, and more
- ✅ **Session caching** - In-memory cache for single-run optimization
- ✅ **Flexible configuration** - YAML-based or programmatic setup
## 📦 Installation
### From GitHub
```bash
pip install git+https://github.com/deinnovatie/omnifetch.git
```
### Local development
```bash
git clone https://github.com/deinnovatie/omnifetch.git
cd omnifetch
pip install -e ".[dev]"
```
## 🚀 Quick Start
### Basic Usage
```python
from omnifetch import DataManager
# Initialize with configuration
config = {
    "storage": {
        "backend": "local",
        "local": {
            "base_path": "./data"
        }
    },
    "data_sources": {
        "my_dataset": {
            "backend": "local",
            "ttl_seconds": 3600,  # 1 hour
            "file_patterns": {
                "default": "datasets/my_data.parquet"
            }
        }
    }
}
manager = DataManager(config)
# Fetch data (automatically cached)
data = manager.get_data("my_dataset", "default")
# Force refresh from source
fresh_data = manager.get_data("my_dataset", "default", force_refresh=True)
```
### With IBM Cloud Object Storage
```python
import os
from omnifetch import DataManager
# Set credentials via environment variables (shown inline here for
# illustration; in practice, export them in your shell)
os.environ["COS_ENDPOINT"] = "https://s3.us-south.cloud-object-storage.appdomain.cloud"
os.environ["COS_ACCESS_KEY"] = "your-access-key"
os.environ["COS_SECRET_KEY"] = "your-secret-key"
os.environ["COS_BUCKET"] = "your-bucket-name"
config = {
    "storage": {
        "backend": "ibm_cos",
        "ibm_cos": {
            "bucket_name": os.environ["COS_BUCKET"],
            "service_endpoint": os.environ["COS_ENDPOINT"],
            "access_key_env": "COS_ACCESS_KEY",
            "secret_key_env": "COS_SECRET_KEY"
        },
        "local": {
            "base_path": "./cache"
        }
    },
    "data_sources": {
        "cloud_dataset": {
            "backend": "ibm_cos",
            "ttl_seconds": 86400,  # 24 hours
            "file_patterns": {
                "default": "datasets/cloud_data.parquet"
            }
        }
    }
}
manager = DataManager(config)
data = manager.get_data("cloud_dataset", "default")
```
### Using YAML Configuration Adapter
```python
from omnifetch.adapters import create_datamanager_config
# Load configuration from YAML file (e.g., R pipeline config)
config = create_datamanager_config(config_path="./config/data_sources.yml")
manager = DataManager(config)
data = manager.get_data("dataset_name", "default")
```
### Batch Operations
```python
from omnifetch import DataManager, FileSpec
manager = DataManager(config)
# Define multiple files to fetch
specs = [
FileSpec("dataset_a", "default", {}),
FileSpec("dataset_b", "default", {}),
FileSpec("dataset_c", "regional", {"region": "us-east"}),
]
# Fetch all at once
results = manager.get_data_batch(specs)
for spec, data in zip(specs, results):
    print(f"Loaded {spec.data_source}: {data.shape}")
```
### Cache Management
```python
# Get cache statistics
stats = manager.get_cache_stats()
print(f"Total cached files: {stats['file_count']}")
print(f"Fresh files: {stats['fresh_count']}")
print(f"Stale files: {stats['stale_count']}")
# Clear session cache (in-memory only)
manager.clear_session_cache()
```
## 🗂️ Configuration
### Configuration Structure
```yaml
storage:
  backend: "ibm_cos"  # or "local"
  ibm_cos:
    bucket_name: "my-bucket"
    service_endpoint: "https://s3.region.cloud-object-storage.appdomain.cloud"
    access_key_env: "COS_ACCESS_KEY"
    secret_key_env: "COS_SECRET_KEY"
  local:
    base_path: "./cache"

data_sources:
  dataset_name:
    backend: "ibm_cos"  # Backend for this specific dataset
    ttl_seconds: 86400  # 24 hours (supports "30d", "24h", "60m", "3600s", or an integer)
    file_patterns:
      default: "path/to/file.parquet"
      regional: "path/to/{region}/file.parquet"  # With parameters
    validation:
      required_columns: ["id", "timestamp", "value"]
    retry_strategy:
      max_retries: 3
      retry_delay: 5
      backoff_multiplier: 2.0
```
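The `retry_strategy` block controls how failed fetches are retried. A minimal sketch of how these three parameters are commonly combined (illustrative only, not omnifetch's internal retry loop):
```python
import time

def fetch_with_retry(fetch, max_retries=3, retry_delay=5, backoff_multiplier=2.0):
    # Illustrative: the delay grows by backoff_multiplier after each failure.
    delay = retry_delay
    for attempt in range(max_retries + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted; propagate the last error
            time.sleep(delay)  # 5s, then 10s, then 20s with the defaults above
            delay *= backoff_multiplier
```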
### TTL Format
TTL can be specified in multiple formats:
- **Seconds**: `3600` or `"3600s"`
- **Minutes**: `"60m"`
- **Hours**: `"24h"`
- **Days**: `"30d"`
- **Weeks**: `"4w"`
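The suffixed forms are shorthand for plain seconds. A minimal sketch of the mapping, assuming simple `<value><unit>` strings (illustrative, not the library's actual parser):
```python
# Assumed unit multipliers for the suffixed TTL strings listed above.
UNIT_SECONDS = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}

def ttl_to_seconds(ttl):
    if isinstance(ttl, int):
        return ttl
    return int(ttl[:-1]) * UNIT_SECONDS[ttl[-1]]

assert ttl_to_seconds("24h") == 86400
assert ttl_to_seconds(3600) == 3600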
### File Patterns with Parameters
```yaml
# Configuration
data_sources:
  regional_data:
    file_patterns:
      default: "data/{year}/{month}/{region}.parquet"
```
```python
# Usage
data = manager.get_data(
    "regional_data",
    "default",
    year="2024",
    month="10",
    region="us-east"
)
# Fetches: data/2024/10/us-east.parquet
```
## 🔧 Supported Storage Backends
### Local Filesystem
```python
config = {
    "storage": {
        "backend": "local",
        "local": {
            "base_path": "./data_repository"
        }
    }
}
```
### IBM Cloud Object Storage (COS)
```python
config = {
    "storage": {
        "backend": "ibm_cos",
        "ibm_cos": {
            "bucket_name": "my-bucket",
            "service_endpoint": "https://s3.us-south.cloud-object-storage.appdomain.cloud",
            "access_key_env": "COS_ACCESS_KEY",
            "secret_key_env": "COS_SECRET_KEY"
        },
        "local": {
            "base_path": "./cache"  # Local cache for downloaded files
        }
    }
}
```
### Custom Backends
Extend `StorageBackend` to create your own:
```python
from omnifetch.storage_backends import StorageBackend
class MyCustomBackend(StorageBackend):
    def exists(self, path: str) -> bool:
        # Return True if `path` is present in the backing store
        ...

    def save(self, data, path: str, format: str = "parquet") -> None:
        # Write `data` to `path` in the given format
        ...

    def load(self, path: str, format: str = "parquet"):
        # Read and return the object stored at `path`
        ...

    # ... implement the remaining abstract methods
```
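As a rough sketch, a directory-based backend built on pandas and pathlib might fill in those three methods as follows. The method signatures come from the stub above; `DirectoryBackend` is a hypothetical name, and any further abstract methods on the base class would also need implementing:
```python
from pathlib import Path

import pandas as pd

from omnifetch.storage_backends import StorageBackend

class DirectoryBackend(StorageBackend):
    """Sketch: reads and writes parquet/CSV files under a root directory."""

    def __init__(self, root: str):
        self.root = Path(root)

    def exists(self, path: str) -> bool:
        return (self.root / path).exists()

    def save(self, data, path: str, format: str = "parquet") -> None:
        target = self.root / path
        target.parent.mkdir(parents=True, exist_ok=True)  # create missing dirs
        if format == "parquet":
            data.to_parquet(target)
        elif format == "csv":
            data.to_csv(target, index=False)
        else:
            raise ValueError(f"unsupported format: {format}")

    def load(self, path: str, format: str = "parquet"):
        source = self.root / path
        if format == "parquet":
            return pd.read_parquet(source)
        return pd.read_csv(source)
```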
## 📊 Supported Data Formats
- **Parquet** - Columnar storage (via PyArrow)
- **CSV** - Comma-separated values (via Pandas)
- **NetCDF** - Multidimensional arrays (via xarray/netCDF4)
- **Pickle** - Python object serialization
- **JSON** - Structured data
Format is auto-detected from the file extension, or it can be specified explicitly.
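Extension-based detection amounts to a small lookup table; a sketch under assumed extension mappings (not the library's actual dispatch table):
```python
from pathlib import Path

# Assumed extension-to-format mapping for the formats listed above.
EXTENSION_FORMATS = {
    ".parquet": "parquet",
    ".csv": "csv",
    ".nc": "netcdf",
    ".pkl": "pickle",
    ".json": "json",
}

def detect_format(path: str) -> str:
    return EXTENSION_FORMATS[Path(path).suffix.lower()]

assert detect_format("datasets/my_data.parquet") == "parquet"
```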
## 🔍 How It Works
### Data Retrieval Flow
```
1. Request data from DataManager
        ↓
2. Check session cache (in-memory)
        ↓ (miss)
3. Check local disk cache with TTL
        ↓ (miss or stale)
4. Download from remote backend (IBM COS, etc.)
        ↓
5. Save to local cache with metadata
        ↓
6. Load from cache and store in session
        ↓
7. Return data to caller
```
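The same flow as a self-contained toy model. Plain dicts stand in for the real caches and backend; none of these names are omnifetch's internals:
```python
import time

session_cache = {}  # step 2: in-memory, cleared each run
disk_cache = {}     # step 3: stands in for the TTL'd disk cache; key -> (data, saved_at)

def fetch_remote(key):
    # step 4: stands in for the storage backend download
    return f"payload for {key}"

def get_data(key, ttl_seconds):
    if key in session_cache:                                   # step 2: session hit
        return session_cache[key]
    entry = disk_cache.get(key)
    if entry is None or time.time() - entry[1] > ttl_seconds:  # step 3: miss or stale
        disk_cache[key] = (fetch_remote(key), time.time())     # steps 4-5: download, persist
    data = disk_cache[key][0]                                  # step 6: load from cache
    session_cache[key] = data
    return data                                                # step 7
```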
### Cache Architecture
```
┌─────────────────────────────────────┐
│ DataManager (Session Cache)         │  <- In-memory, per-run
└─────────────────────────────────────┘
                   ↓
┌─────────────────────────────────────┐
│ CacheManager (Disk Cache + TTL)     │  <- Persistent, with metadata
└─────────────────────────────────────┘
                   ↓
┌─────────────────────────────────────┐
│ SyncManager (Backend Sync Logic)    │  <- Download/upload coordination
└─────────────────────────────────────┘
                   ↓
┌─────────────────────────────────────┐
│ StorageBackend (Local/IBM COS)      │  <- Actual data source
└─────────────────────────────────────┘
```
## 🧪 Testing
```bash
# Run tests
pytest
# With coverage
pytest --cov=omnifetch --cov-report=html
# Run specific test
pytest tests/test_data_manager.py::test_get_data_from_cache
```
## 📝 Examples
See the `examples/` directory for complete working examples:
- [basic_usage.py](examples/basic_usage.py) - Basic local file caching
- [ibm_cos_example.py](examples/ibm_cos_example.py) - IBM COS integration
- [yaml_config_example.py](examples/yaml_config_example.py) - YAML configuration
- [custom_backend_example.py](examples/custom_backend_example.py) - Custom storage backend
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
Originally developed as part of the SIMEG R Toolkit project for energy market simulations.
## 📧 Contact
Marco Bonoli - marco@deinnovatie.com
Project Link: [https://github.com/deinnovatie/omnifetch](https://github.com/deinnovatie/omnifetch)