| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| ChemInformant | 2.4.1 | A robust and high-throughput Python client for the PubChem API, designed for automated data retrieval and analysis | 2025-08-24 09:31:46 |
| llm_batch_helper | 0.2.0 | A Python package that enables batch submission of prompts to LLM APIs, with built-in async capabilities and response caching. | 2025-08-23 15:33:46 |
| nimble-llm-caller | 0.2.2 | A robust, multi-model LLM calling package with intelligent context management, file processing, and advanced prompt handling | 2025-08-22 05:40:15 |
| batchata | 0.4.7 | Unified Python API for AI batch requests with 50% cost savings on OpenAI and Anthropic | 2025-08-20 02:06:30 |
| git-batch | 4.0.12 | Clone a single branch from all repositories listed in a file. | 2025-08-17 18:55:44 |
| universal-printer | 3.0.0 | Cross-platform document printing with enhanced PDF generation | 2025-08-09 12:04:01 |
| bipl | 0.6.3 | OpenSlide/libtiff/GDAL ndarray-like interface and lazy parallel tile-based processing | 2025-08-08 23:04:54 |
| openai-so-batch | 0.1.1 | Python library for creating and managing OpenAI Structured Outputs batch API calls | 2025-08-04 17:53:55 |
| azllm | 0.1.6 | A Python package that provides an easier user interface for multiple LLM providers. | 2025-07-31 01:06:46 |
| opsqueue | 0.30.2 | Python client library for Opsqueue, the lightweight batch processing queue for heavy loads | 2025-07-30 08:19:58 |
| m9ini | 1.0.4 | m9 ini configuration | 2025-07-28 21:20:52 |
| django-routines | 1.6.1 | Define named groups of management commands in Django settings files for batched execution. | 2025-07-25 20:47:00 |
| openai-batch | 0.3.2 | Make the OpenAI batch API easy to use. | 2025-07-23 18:25:16 |
| enhanced-chinese-translator | 1.0.1 | High-performance Chinese-to-English translation tool with multi-threading and batch processing | 2025-07-23 07:16:54 |
| model-ensembler | 0.6.2 | Model Ensembler for managed batch workflows | 2025-07-22 15:31:57 |
| m9lib | 1.0.1 | m9 utility library | 2025-07-20 17:49:29 |
| llm-batch-helper | 0.1.2 | A Python package that enables batch submission of prompts to LLM APIs, with built-in async capabilities and response caching. | 2025-07-16 17:35:51 |
| jay-mcp | 0.2.1 | MCP server for batch GitHub repository information retrieval with parallel processing | 2025-07-14 08:14:22 |
| codeflare-sdk | 0.30.0 | Python SDK for the CodeFlare client | 2025-07-08 13:20:55 |
| mh-tgtools | 0.3.1 | A collection of tools to manage Ceragon by Siklu MultiHaul TG radios | 2025-02-02 23:36:28 |