# ModelBridge
[](https://badge.fury.io/py/modelbridge)
[](https://www.python.org/downloads/)
**Simple Multi-Provider LLM Gateway**
ModelBridge is a Python library that provides unified access to multiple AI providers. It simplifies working with OpenAI, Anthropic, Google, and Groq APIs through a single interface.
## ✨ What it Does
- **Multi-Provider Support**: Works with OpenAI, Groq, Google, and Anthropic
- **Smart Model Selection**: Automatically picks models based on your task
- **Cost Tracking**: Shows you how much each request costs
- **Simple API**: One function (`ask`) for most use cases
- **Environment Variables**: Secure API key management
## 🚀 Quick Start
### Installation
```bash
pip install modelbridge
```
### Basic Usage
```python
import asyncio
from modelbridge.simple import ask
# Set your API keys as environment variables:
# export OPENAI_API_KEY='your_openai_key'
# export GROQ_API_KEY='your_groq_key'
async def main():
    response = await ask("What is the capital of France?")
    print(f"Answer: {response.content}")
    print(f"Model: {response.model_id}")
    print(f"Provider: {response.provider_name}")
    print(f"Cost: ${response.cost:.4f}")
asyncio.run(main())
```
### Advanced Usage
```python
# These calls must run inside an async function, as in Basic Usage.

# Specify a particular model
response = await ask("Write Python code", model="gpt-4")
# Optimize for speed, cost, or quality
response = await ask("Simple math", optimize_for="cost")
response = await ask("Complex analysis", optimize_for="quality")
# Set maximum cost limit
response = await ask("Quick question", max_cost=0.001)
# Other useful functions
from modelbridge.simple import code, translate, summarize
# Generate code
response = await code("Create a sorting function", language="python")
# Translate text
response = await translate("Hello world", "Spanish")
# Summarize text
response = await summarize("Long article text here...", length="short")
```
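Because every response carries a `cost` field (as shown in Basic Usage), a simple budget guard can be layered on top of repeated calls. A minimal sketch, using a stand-in `fake_ask` in place of `modelbridge.simple.ask` so it runs without API keys; the `Response` fields mirror the ones used above:

```python
import asyncio
from dataclasses import dataclass

# Stand-in response object mirroring the fields used in Basic Usage.
@dataclass
class Response:
    content: str
    cost: float

async def fake_ask(prompt: str) -> Response:
    """Placeholder for modelbridge.simple.ask; returns a fixed fake cost."""
    return Response(content=f"answer to {prompt!r}", cost=0.0004)

async def ask_within_budget(prompts, budget):
    """Run prompts in order, stopping before the budget would be exceeded."""
    spent, answers = 0.0, []
    for prompt in prompts:
        resp = await fake_ask(prompt)
        if spent + resp.cost > budget:
            break
        spent += resp.cost
        answers.append(resp.content)
    return answers, spent

answers, spent = asyncio.run(ask_within_budget(["a", "b", "c"], budget=0.001))
```

To use this with the real library, replace `fake_ask` with `ask` and the same loop applies unchanged.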
## Configuration
### API Keys
Set your API keys as environment variables:
```bash
# Add the providers you want to use
export OPENAI_API_KEY='sk-your-openai-key'
export GROQ_API_KEY='gsk_your-groq-key'
export GOOGLE_API_KEY='your-google-key'
export ANTHROPIC_API_KEY='sk-ant-your-anthropic-key'
```
You need at least one API key for ModelBridge to work.
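Since only the providers whose keys are set are available, it can help to check the environment up front. A small sketch (the variable names come from the list above; the helper itself is illustrative, not part of ModelBridge's API):

```python
import os

# Env var that enables each provider, per the Configuration section.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "groq": "GROQ_API_KEY",
    "google": "GOOGLE_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
}

def configured_providers(env=None):
    """Return the sorted names of providers whose API key is set."""
    env = os.environ if env is None else env
    return sorted(name for name, var in PROVIDER_ENV_VARS.items() if env.get(var))

if not configured_providers():
    print("Set at least one provider API key before using ModelBridge.")
```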
## Supported Providers
- **OpenAI**: GPT-4, GPT-3.5-turbo, GPT-5-mini
- **Groq**: Llama 3.3 (very fast)
- **Google**: Gemini models
- **Anthropic**: Claude models
## How It Works
1. You call `ask("your question")`.
2. ModelBridge analyzes your prompt.
3. It picks the best model for the task.
4. It sends the request, handling errors and retries automatically.
5. It returns the response along with its cost.
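The model-selection step can be sketched roughly as follows. The model table, scores, and per-token prices here are purely illustrative (not ModelBridge's real internals or real rates); the sketch only shows how `optimize_for` and `max_cost` could drive a choice:

```python
# Hypothetical model table; prices and scores are made up for illustration.
MODELS = [
    {"id": "gpt-4", "provider": "openai", "cost_per_1k": 0.03, "quality": 9, "speed": 5},
    {"id": "llama-3.3-70b", "provider": "groq", "cost_per_1k": 0.0008, "quality": 7, "speed": 9},
]

def pick_model(optimize_for="balanced", max_cost=None):
    """Choose a model matching the optimization goal and cost cap."""
    candidates = MODELS
    if max_cost is not None:
        candidates = [m for m in candidates if m["cost_per_1k"] <= max_cost]
    if not candidates:
        raise ValueError("no model fits the cost limit")
    if optimize_for == "cost":
        return min(candidates, key=lambda m: m["cost_per_1k"])
    if optimize_for == "quality":
        return max(candidates, key=lambda m: m["quality"])
    if optimize_for == "speed":
        return max(candidates, key=lambda m: m["speed"])
    return candidates[0]  # default: first available model
```

For example, `pick_model("cost")` would route to the cheap Groq model, while `pick_model("quality")` would choose GPT-4.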
## License
MIT License
---
**Simple AI provider integration.**