| Name | Version | Summary | Date |
|------|---------|---------|------|
| agent-oauth | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:34:12 |
| agentic-oauth | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:31:03 |
| agentic-auth | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:30:44 |
| auth-agent | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:30:16 |
| mcp-better-auth | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:28:59 |
| agent-better-auth | 0.0.1 | A Python SDK for MCP tool integration with LLM providers | 2025-07-28 07:28:43 |
| metis-agent | 0.3.0 | A framework for building powerful AI agents with intelligent memory management and minimal boilerplate code | 2025-07-28 02:22:33 |
| batchata | 0.4.0 | Unified Python API for AI batch requests with 50% cost savings on OpenAI and Anthropic | 2025-07-27 17:08:53 |
| status-whimsy | 0.1.0 | A lightweight library for generating whimsical status updates using Claude Haiku 3 | 2025-07-27 16:00:22 |
| zapgpt | 3.1.3 | A command-line tool for interacting with various LLM providers | 2025-07-27 13:33:43 |
| indoxrouter | 0.1.25 | A unified client for various AI providers | 2025-07-27 10:28:11 |
| parllama | 0.3.28 | Terminal UI for Ollama and other LLM providers | 2025-07-26 15:59:02 |
| metorial | 1.0.0rc3 | Python SDK for Metorial - AI-powered tool calling and session management | 2025-07-26 12:37:24 |
| metorial-anthropic | 1.0.0rc3 | Anthropic (Claude) provider for Metorial | 2025-07-26 12:37:20 |
| structllm | 0.1.0 | Universal Python library for Structured Outputs with any LLM provider | 2025-07-24 23:32:32 |
| bestehorn-llmmanager | 0.1.13 | A comprehensive Python library for managing AWS Bedrock Converse API interactions with multi-model support, intelligent retry logic, and parallel processing capabilities | 2025-07-24 12:40:22 |
| claude-code-sdk | 0.0.17 | Python SDK for Claude Code | 2025-07-24 05:51:36 |
| yaicli | 0.8.10 | A simple CLI tool to interact with LLMs | 2025-07-24 05:37:04 |
| revenium-mcp | 0.1.25 | Model Context Protocol (MCP) server for Revenium API monitoring, analytics, and AI spending management | 2025-07-23 22:50:50 |
| observee | 0.1.20 | Observee SDK - Tool usage logging, monitoring, authentication, and agent system for LLM integrations | 2025-07-23 22:45:55 |