| Name | Version | Summary | Date |
|------|---------|---------|------|
| catzilla | 0.2.0 | Ultra-fast Python web framework with C-accelerated routing | 2025-09-02 13:38:07 |
| impit | 0.6.0 | A library for making HTTP requests through browser impersonation | 2025-09-02 11:42:52 |
| janus-python-sdk | 0.2.91 | Automatically stress-test your AI apps. Sign up for an API key at withjanus.com | 2025-09-02 01:47:53 |
| webdavit | 0.1.0 | A simple async WebDAV client for Python | 2025-09-01 23:20:32 |
| turso-python | 1.8 | A Python client for the Turso/libsql HTTP API with sync and async support | 2025-09-01 08:42:06 |
| barcode-server | 2.5.0 | A simple daemon that exposes USB barcode scanner data to other services via WebSockets, webhooks, or MQTT | 2025-08-31 22:26:12 |
| flowllm | 0.1.3 | A flexible framework for building LLM-powered flows and MCP services | 2025-08-31 08:17:10 |
| purpletrader | 0.1.1 | A lightweight Python client for the Amherst College Quant Club Live Trading Engine | 2025-08-31 00:56:11 |
| omni-pathlib | 0.5.0 | A unified path-handling library supporting path operations on local filesystems, HTTP, and S3 storage | 2025-08-30 17:16:20 |
| mcp-server-multi-fetch | 0.8.1 | A Model Context Protocol server providing tools to fetch and convert web content for use by LLMs | 2025-08-30 02:10:38 |
| mcp-server-fetch2 | 1.6.4 | A Model Context Protocol server providing tools to fetch and convert web content for use by LLMs | 2025-08-29 15:43:32 |
| reflectapi-runtime | 0.1.0 | Runtime library for ReflectAPI Python clients | 2025-08-29 10:17:52 |
| graphql-http | 1.6.2 | HTTP server for GraphQL | 2025-08-29 08:15:00 |
| ry | 0.0.54 | ry = rust + python | 2025-08-29 00:27:41 |
| openai-http-proxy | 0.2.1 | "OpenAI HTTP Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:10 |
| oai-proxy | 0.2.1 | "OAI Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:09 |
| lm-proxy-server | 0.2.1 | "LM Proxy Server" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:08 |
| lm-proxy | 0.2.1 | "LM-Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:06 |
| llm-proxy-server | 0.2.1 | "LLM Proxy Server" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:05 |
| inference-proxy | 0.2.1 | "Inference Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-08-28 00:33:04 |