| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| rwkv-tokenizer | 0.11.0 | RWKV Tokenizer | 2025-07-16 09:53:48 |
| openaivec | 0.9.7 | Generative mutation for tabular calculation | 2025-07-16 05:00:53 |
| chainlit | 2.6.2 | Build Conversational AI. | 2025-07-16 04:29:41 |
| vectara-agentic | 0.3.0 | A Python package for creating AI Assistants and AI Agents with Vectara | 2025-07-16 03:45:23 |
| mtllm | 0.3.9 | MTLLM provides easy-to-use APIs for different LLM providers to be used with Jaseci's Jaclang programming language. | 2025-07-16 03:40:54 |
| llm-tavily | 0.0.1 | LLM plugin to interact with the Tavily API | 2025-07-16 00:22:33 |
| llm-togetherai | 0.1.0 | LLM plugin for models hosted by Together AI | 2025-07-15 23:37:09 |
| pytest-brightest | 0.1.0 | Bright ideas for improving your pytest experience | 2025-07-15 22:43:15 |
| neuro-san | 0.5.45 | NeuroAI data-driven System for multi-Agent Networks: client, library, and server | 2025-07-15 19:49:43 |
| llm-digitalocean | 0.1.0 | LLM plugin for models hosted by DigitalOcean AI | 2025-07-15 19:43:10 |
| llmrelic | 0.1.3 | A lightweight Python library that provides easy access to popular LLM model names and lets you define which models your application supports. | 2025-07-15 14:13:19 |
| forgellm | 0.4.2 | A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing, and publishing of language models with MLX-LM | 2025-07-15 11:31:33 |
| slimcontext | 0.9.0 | A tool to turn a Git repository into context for LLMs | 2025-07-15 04:06:25 |
| llm-litellm | 0.2.0 | LLM plugin for the LiteLLM proxy server | 2025-07-15 03:43:47 |
| llm-requesty | 0.1.0 | LLM plugin for models hosted by Requesty | 2025-07-15 03:33:42 |
| any-llm-client | 3.2.1 | Add your description here | 2025-07-14 14:33:08 |
| autobyteus-llm-client | 1.1.1 | Async Python client for the Autobyteus LLM API | 2025-07-14 10:37:21 |
| talktollm | 0.4.2 | A Python utility for interacting with large language models (LLMs) via web automation | 2025-07-14 10:04:04 |
| chuk-tool-processor | 0.6.2 | Async-native framework for registering, discovering, and executing tools referenced in LLM responses | 2025-07-14 08:40:41 |
| codeflash | 0.15.5 | Client for codeflash.ai - automatic code performance optimization, powered by AI | 2025-07-14 04:37:52 |