| Name | Version | Summary | Date |
| --- | --- | --- | --- |
| novaeval | 0.5.2 | A comprehensive, open-source LLM evaluation framework for testing and benchmarking AI models | 2025-08-29 06:18:10 |
| uqlm | 0.2.6 | UQLM (Uncertainty Quantification for Language Models) is a Python package for UQ-based LLM hallucination detection. | 2025-08-28 18:14:08 |
| galileo-core | 3.64.4 | Shared schemas and configuration for Galileo's Python packages. | 2025-08-28 15:40:27 |
| nova-ci-rescue | 0.4.3 | Nova CI-Rescue: automated test fixing agent (MVP) | 2025-08-28 15:11:18 |
| llm-mcp-cli | 1.0.5 | LLM plugin for Model Context Protocol (MCP) integration | 2025-08-28 14:35:59 |
| portia-sdk-python | 0.7.3 | Portia Labs Python SDK for building agentic workflows. | 2025-08-28 11:32:44 |
| openaivec | 0.14.14 | Generative mutation for tabular calculation | 2025-08-28 09:38:41 |
| mcp-server-time-allenash-pypi-test | 0.6.2 | A Model Context Protocol server providing tools for time queries and timezone conversions for LLMs | 2025-08-28 07:16:08 |
| imagesorcery-mcp | 0.11.4 | A Model Context Protocol server providing image manipulation tools for LLMs | 2025-08-28 04:56:42 |
| evalassist | 0.1.26 | EvalAssist is an open-source project that simplifies using large language models as evaluators (LLM-as-a-Judge) of the output of other large language models by supporting users in iteratively refining evaluation criteria in a web-based user experience. | 2025-08-28 01:10:48 |
| openai-http-proxy | 0.2.1 | "OpenAI HTTP Proxy" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:10 |
| oai-proxy | 0.2.1 | "OAI Proxy" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:09 |
| lm-proxy-server | 0.2.1 | "LM Proxy Server" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:08 |
| lm-proxy | 0.2.1 | "LM-Proxy" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:06 |
| llm-proxy-server | 0.2.1 | "LLM Proxy Server" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:05 |
| inference-proxy | 0.2.1 | "Inference Proxy" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:04 |
| ai-proxy-server | 0.2.1 | "AI Proxy Server" is an OpenAI-compatible HTTP proxy server for inferencing various LLMs, capable of working with Google, Anthropic, OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:03 |
| ai-dynamo-runtime | 0.4.1 | Dynamo Inference Framework Runtime | 2025-08-27 23:23:55 |
| kalavai-client | 0.7.2 | Client app for the Kalavai platform | 2025-08-27 20:33:57 |
| mchat-core | 0.1.4 | Framework for building multi-agent chat applications with AutoGen | 2025-08-27 01:37:13 |