pydantic-ai-litellm

Name: pydantic-ai-litellm
Version: 0.2.0
Summary: LiteLLM model integration for Pydantic AI framework - access 100+ LLM providers through a unified interface
Upload time: 2025-08-30 23:26:39
Requires Python: >=3.13
Keywords: pydantic, ai, llm, litellm, openai, claude, gemini, machine-learning
# Pydantic AI LiteLLM

A LiteLLM model integration for the [Pydantic AI](https://ai.pydantic.dev/) framework, enabling access to 100+ LLM providers through a unified interface.

## Features

- **Universal LLM Access**: Connect to 100+ LLM providers (OpenAI, Anthropic, Cohere, Bedrock, Azure, and many more) via LiteLLM
- **Full Pydantic AI Integration**: Support for tool calling, streaming, structured outputs, and other core Pydantic AI features
- **Type Safety**: Fully typed with comprehensive type hints
- **Async/Await Support**: Built for modern async Python applications
- **Flexible Configuration**: Support for custom API endpoints, headers, and provider-specific settings

## Installation

```bash
pip install pydantic-ai-litellm
```

## Quick Start

```python
import asyncio
from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

# Initialize with any LiteLLM-supported model
model = LiteLLMModel(
    model_name="gpt-4",  # or claude-3-opus, gemini-pro, etc.
    api_key="your-api-key"  # will also check environment variables
)

# Create an agent
agent = Agent(model=model)

# Run inference
async def main():
    result = await agent.run("What is the capital of France?")
    print(result.output)

asyncio.run(main())
```
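If you prefer not to pass `api_key` directly, LiteLLM falls back to the provider's standard environment variable (for OpenAI-style models, `OPENAI_API_KEY`). A minimal sketch with a placeholder key:

```python
import os

# LiteLLM reads the provider's standard environment variable when no
# api_key argument is given; for OpenAI models that is OPENAI_API_KEY.
# The value below is a hypothetical placeholder, not a real key.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"

# With the variable set, LiteLLMModel("gpt-4") can be constructed
# without an explicit api_key argument.
print(os.environ["OPENAI_API_KEY"])
```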

## Supported Providers

This library supports all providers available through LiteLLM, including:

- **OpenAI**: GPT-4, GPT-3.5, o1, etc.
- **Anthropic**: Claude 3 (Opus, Sonnet, Haiku)
- **Google**: Gemini Pro, Gemini Flash
- **AWS Bedrock**: Claude, Titan, Cohere models
- **Azure OpenAI**: All Azure-hosted models
- **Cohere**: Command, Command R+
- **Mistral AI**: Mistral 7B, Mixtral 8x7B, Mistral Large
- **And 90+ more providers**

See the [LiteLLM providers documentation](https://docs.litellm.ai/docs/providers) for the complete list.
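Since LiteLLM routes requests by a `provider/model` naming convention, switching providers is usually just a matter of changing `model_name`. The names below are illustrative; exact model ids vary by provider and change over time:

```python
# Illustrative names following LiteLLM's "provider/model" routing
# convention; check the LiteLLM providers docs for current model ids.
model_names = {
    "openai": "gpt-4",  # OpenAI models need no prefix
    "anthropic": "anthropic/claude-3-opus-20240229",
    "google": "gemini/gemini-pro",
    "mistral": "mistral/mistral-large-latest",
}

for provider, name in model_names.items():
    print(provider, "->", name)
```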

## Advanced Usage

### Custom API Endpoints

```python
model = LiteLLMModel(
    model_name="custom-model",
    api_base="https://your-custom-endpoint.com/v1",
    api_key="your-api-key",
    custom_llm_provider="openai"  # specify provider format
)
```

### Tool Calling

```python
from pydantic_ai import Agent
from pydantic_ai_litellm import LiteLLMModel

def get_weather(location: str) -> str:
    """Get weather for a location."""
    return f"It's sunny in {location}"

model = LiteLLMModel("gpt-4")
agent = Agent(model=model, tools=[get_weather])

result = await agent.run("What's the weather in Paris?")  # inside an async function
```

### Streaming

```python
# inside an async function:
async with agent.run_stream("Write a poem about AI") as stream:
    async for text in stream.stream_text(delta=True):
        print(text, end="", flush=True)
```
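With `delta=True`, each chunk contains only the newly generated text, so the full response is the concatenation of all deltas. A self-contained sketch, with hypothetical chunks standing in for what `stream_text(delta=True)` yields:

```python
import asyncio

# Hypothetical delta chunks, standing in for a model's streamed output.
async def fake_deltas():
    for piece in ["Silicon ", "minds ", "dream."]:
        yield piece

async def collect():
    # Each item is only the newly generated text, so the full response
    # is rebuilt by concatenating the deltas in order.
    full = ""
    async for delta in fake_deltas():
        full += delta
    return full

result = asyncio.run(collect())
print(result)  # Silicon minds dream.
```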

### Structured Output

```python
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    occupation: str

agent = Agent(model=model, output_type=Person)
result = await agent.run("Generate a person profile")  # inside an async function
print(result.output.name)  # Typed as Person
```
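Under the hood, the structured-output path validates the model's response against the Pydantic schema. That validation step can be exercised standalone with pydantic v2's `model_validate` (the payloads below are hypothetical):

```python
from pydantic import BaseModel, ValidationError

class Person(BaseModel):
    name: str
    age: int
    occupation: str

# Hypothetical payload, shaped like a model's JSON response.
ok = Person.model_validate({"name": "Ada", "age": 36, "occupation": "engineer"})
print(ok.name)  # Ada

# A payload that misses the schema raises ValidationError instead of
# silently producing a malformed object.
try:
    Person.model_validate({"name": "Bob", "age": "not-a-number"})
except ValidationError as e:
    print("invalid payload:", e.error_count(), "error(s)")
```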

## Configuration

You can configure the model with various settings:

```python
from pydantic_ai_litellm import LiteLLMModelSettings

settings: LiteLLMModelSettings = {
    'temperature': 0.7,
    'max_tokens': 1000,
    'litellm_api_key': 'your-key',
    'litellm_api_base': 'https://custom-endpoint.com',
    'extra_headers': {'Custom-Header': 'value'}
}

model = LiteLLMModel("gpt-4", settings=settings)
```
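Because the settings are expressed as a plain dict (as in the example above), shared defaults can be composed with per-model overrides using ordinary dict merging. A sketch reusing the keys from the example:

```python
# Settings are plain dicts, so defaults and overrides compose with
# standard dict merging; keys mirror the configuration example above.
defaults = {
    "temperature": 0.7,
    "max_tokens": 1000,
    "extra_headers": {"Custom-Header": "value"},
}
overrides = {"temperature": 0.2}

merged = {**defaults, **overrides}  # later keys win
print(merged["temperature"])  # 0.2
```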

## Requirements

- Python 3.13+
- `pydantic-ai-slim>=0.6.2`
- `litellm>=1.75.5`

## Contributing

Contributions are welcome! Please feel free to submit issues and pull requests.

## License

MIT License - see LICENSE file for details.

## Examples

See the `examples/` directory for complete working examples:

- **Quick Start** (`examples/01_quick_start.py`) - Basic usage
- **Custom Endpoints** (`examples/02_custom_endpoints.py`) - Using custom API endpoints  
- **Tool Calling** (`examples/03_tool_calling.py`) - Functions as AI tools
- **Streaming** (`examples/04_streaming.py`) - Real-time text streaming
- **Structured Output** (`examples/05_structured_output.py`) - Typed responses with Pydantic
- **Configuration** (`examples/06_configuration.py`) - Model settings and parameters

Each example includes error handling and can be run independently with the appropriate API keys.

## Links

- [Pydantic AI Documentation](https://ai.pydantic.dev/)
- [LiteLLM Documentation](https://docs.litellm.ai/)
- [GitHub Repository](https://github.com/mochow13/pydantic-ai-litellm)

            
