# AI Gateway Unified Client Package
A unified AI Gateway package supporting three providers: Azure OpenAI, OpenRouter, and Skywork. Specify a provider and a model name and the call is handled for you with minimal boilerplate.
## 🚀 Quick Start
### 1. Install the Package
**Option 1: Install from source (recommended)**
```bash
# Clone or download the project
git clone <your-repo-url>
cd gateways
# Install in editable/development mode (code changes take effect immediately)
pip install -e .
```
**Option 2: Install from a local copy**
```bash
# Run from the project root
pip install .
```
**Option 3: Build a distribution and install it**
```bash
# Install the build tools (if not already present)
pip install setuptools wheel
# Build the source and wheel distributions
python setup.py sdist bdist_wheel
# Install from the sdist
pip install dist/ai-gateways-1.0.0.tar.gz
# Or install from the wheel file
pip install dist/ai_gateways-1.0.0-py3-none-any.whl
```
**Option 4: Install from PyPI (once published)**
```bash
pip install ai-gateways
```
**Installation pulls in all required dependencies automatically:**
- openai>=1.0.0
- python-dotenv>=1.0.0
- requests>=2.31.0
- urllib3>=1.26.0
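Before configuring any keys, a quick sanity check is to import the package and confirm the entry points documented below are exposed. This is a minimal sketch that assumes nothing beyond the public names listed in this README:
```python
# Minimal post-install sanity check: import the package and confirm the
# documented entry points are present. No API keys are needed for this.
import gateways

for name in ("chat", "chat_async", "chat_with_history",
             "configure_api_keys", "set_provider", "get_available_models"):
    print(f"{name}: {'ok' if hasattr(gateways, name) else 'missing'}")
```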
### 2. Configure API Keys
**Important:** Your API keys are private information and are not bundled with the package. You must configure them yourself.
There are two ways to configure API keys:
**Option 1: Configure in code (recommended, more secure)**
```python
from gateways import configure_api_keys, chat
# Configure your API keys
configure_api_keys(
    azure_api_key="your-azure-api-key",
    azure_endpoint="https://your-endpoint.cognitiveservices.azure.com/",
    openrouter_api_key="your-openrouter-api-key",
)
# Make a call
reply = chat("Hello", "gpt-4o-mini", provider="azure")
```
**Option 2: Use a .env file**
1. Copy the template file:
```bash
cp .env.example .env
```
2. Edit the `.env` file and fill in your real API keys:
```bash
# Azure OpenAI (optional)
AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_ENDPOINT=https://your-endpoint.cognitiveservices.azure.com/
# OpenRouter (optional)
OPENROUTER_API_KEY=your-openrouter-api-key
OPENROUTER_SITE_URL=https://your-site.com # optional
OPENROUTER_SITE_NAME=Your Site Name # optional
# Skywork (optional; supports GPT and Gemini)
# For GPT models, configure:
OPENAI_BASE_URL=your-openai-base-url
OPENAI_API_KEY=your-openai-api-key
# For Gemini models, configure:
GOOGLE_BASE_URL=your-google-base-url
GOOGLE_API_KEY=your-google-api-key
```
**Notes:**
- The `.env` file is never uploaded to PyPI, so your API keys stay private
- API keys for all providers go in a single `.env` file in the project root
- At least one provider must be configured
- The `configure_api_keys()` function is the recommended, more secure way to configure keys
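If you prefer to load the `.env` file explicitly (for example when running a script from another directory), a minimal sketch using the bundled python-dotenv dependency could look like the following; the file path is an assumption to adjust for your own layout:
```python
# Optional: load the .env file explicitly before calling the gateway.
# python-dotenv is already a declared dependency of this package.
from dotenv import load_dotenv
from gateways import chat

load_dotenv(".env")  # adjust the path if your .env lives elsewhere
reply = chat("Hello", "gpt-4o-mini", provider="azure")
print(reply)
```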
### 3. Usage
```python
from gateways import chat
# Use unified model names (recommended) ✨
# The package automatically maps 'gpt-4o-mini' to each provider's actual model name
reply = chat("Hello", "gpt-4o-mini", provider="azure")       # Azure: gpt-4o-mini
reply = chat("Hello", "gpt-4o-mini", provider="openrouter")  # OpenRouter: openai/gpt-4o-mini
reply = chat("Hello", "gpt-4o", provider="skywork")          # Skywork: gpt-4o
reply = chat("Hello", "gemini-2.5-pro", provider="skywork")  # Skywork: gemini-2.5-pro
# If provider is omitted, an available provider is detected automatically
reply = chat("Hello", "gpt-4o-mini")  # Uses Azure automatically (if configured)
# Full model IDs are also accepted (backward compatible)
reply = chat("Hello", "openai/gpt-4o-mini", provider="openrouter")
```
## 📖 API Documentation
### `chat(prompt, model, provider=None, **kwargs)`
Synchronously call an AI model (the simplest entry point).
**Parameters:**
- `prompt` (str): the user message
- `model` (str): a **unified model name** (recommended), e.g. `'gpt-4o-mini'`
  - Automatically mapped to each provider's actual model name
  - Azure: `'gpt-4o-mini'` → `'gpt-4o-mini'` (deployment name)
  - OpenRouter: `'gpt-4o-mini'` → `'openai/gpt-4o-mini'`
  - Full model IDs (e.g. `'openai/gpt-4o-mini'`) are also accepted and used as-is
- `provider` (str, optional): the provider (`'azure'`, `'openrouter'`, or `'skywork'`); if None, an available provider is detected automatically
- `**kwargs`: additional parameters (temperature, max_tokens, etc.)
**Returns:**
- `str`: the model's reply text
**Example:**
```python
from gateways import chat
# Use a unified model name (recommended) ✨
reply = chat("Explain what artificial intelligence is", "gpt-4o-mini", provider="azure", temperature=0.7)
reply = chat("Explain what artificial intelligence is", "gpt-4o-mini", provider="openrouter")  # mapped to openai/gpt-4o-mini
# Other OpenRouter models
reply = chat("Explain what artificial intelligence is", "claude-3-haiku", provider="openrouter")  # mapped to anthropic/claude-3-haiku
reply = chat("Explain what artificial intelligence is", "gemma-free", provider="openrouter")      # mapped to google/gemma-3n-e2b-it:free
# Full model IDs also work (backward compatible)
reply = chat("Explain what artificial intelligence is", "openai/gpt-4o-mini", provider="openrouter")
```
### `chat_async(prompt, model, provider=None, **kwargs)`
Asynchronously call an AI model.
**Example:**
```python
import asyncio
from gateways import chat_async
async def main():
    reply = await chat_async("Hello", "gpt-4o-mini", provider="azure")
    print(reply)
asyncio.run(main())
```
### `chat_with_history(messages, model, provider=None, **kwargs)`
Chat using a message history.
**Example:**
```python
from gateways import chat_with_history
messages = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help you?"},
    {"role": "user", "content": "What's the weather like today?"}
]
reply = chat_with_history(messages, "gpt-4o-mini", provider="azure")
```
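A common follow-up is to carry the conversation forward by appending the returned reply to the history before the next call. A minimal sketch, assuming `chat_with_history` returns the reply string as documented above:
```python
from gateways import chat_with_history

# Build up the history turn by turn, appending each assistant reply
# before sending the next user message.
history = [{"role": "user", "content": "Hello"}]
reply = chat_with_history(history, "gpt-4o-mini", provider="azure")

history.append({"role": "assistant", "content": reply})
history.append({"role": "user", "content": "Tell me a joke"})
reply = chat_with_history(history, "gpt-4o-mini", provider="azure")
print(reply)
```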
### `get_client(provider=None, async_mode=False, **kwargs)`
Get a client instance (advanced usage).
**Example:**
```python
from gateways import get_client
client = get_client(provider="azure")
completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}]
)
```
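For async pipelines, the same helper exposes an `async_mode` flag. The sketch below assumes `async_mode=True` returns an AsyncOpenAI-style client with the same `chat.completions.create` interface; check the object your installed version actually returns before relying on this:
```python
import asyncio

from gateways import get_client

async def main():
    # Assumption: async_mode=True yields a client whose create() call is awaitable.
    client = get_client(provider="azure", async_mode=True)
    completion = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(completion.choices[0].message.content)

asyncio.run(main())
```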
### `set_provider(provider)`
Set the default provider.
**Example:**
```python
from gateways import chat, set_provider
set_provider("azure")  # Make Azure the default provider
reply = chat("Hello", "gpt-4o-mini")  # Azure is used automatically
```
### `configure_api_keys(...)`
Configure API keys in code (takes precedence over environment variables).
**Parameters:**
- `azure_api_key`: Azure OpenAI API key
- `azure_endpoint`: Azure OpenAI endpoint
- `openrouter_api_key`: OpenRouter API key
- `openrouter_site_url`: OpenRouter site URL (optional)
- `openrouter_site_name`: OpenRouter site name (optional)
- `skywork_openai_base_url`: Skywork OpenAI base URL
- `skywork_openai_api_key`: Skywork OpenAI API key
- `skywork_google_base_url`: Skywork Google base URL
- `skywork_google_api_key`: Skywork Google API key
**Example:**
```python
from gateways import configure_api_keys, chat
# Configure your API keys
configure_api_keys(
    azure_api_key="your-azure-key",
    azure_endpoint="https://your-endpoint.cognitiveservices.azure.com/",
    openrouter_api_key="your-openrouter-key",
)
# Make a call
reply = chat("Hello", "gpt-4o-mini", provider="azure")
```
**Note:** Keys configured in code take precedence over environment variables.
## 🎯 Common Models (Unified Names)
### General models (supported by all providers)
- `gpt-4o-mini` - GPT-4o Mini (recommended; fast and economical)
- `gpt-4o` - GPT-4o (latest model)
- `gpt-4-turbo` - GPT-4 Turbo
### OpenRouter-specific models
- `claude-3-haiku` - Claude 3 Haiku (fast and inexpensive)
- `claude-3-sonnet` - Claude 3 Sonnet (balanced)
- `claude-3-opus` - Claude 3 Opus (most capable)
- `gemma-free` - Google Gemma (free)
- `gemini-free` - Google Gemini 2.0 Flash (free)
- `gemini-pro` - Google Gemini Pro
- `llama-3.3-8b` - Llama 3.3 8B (free)
- `llama-3.1-8b` - Llama 3.1 8B (free)
### Skywork-specific models
**GPT models:**
- `gpt-4o-mini` - GPT-4o Mini
- `gpt-4o` - GPT-4o
- `gpt-4o-2024-11-20` - GPT-4o (specific dated version)
- `gpt-4` - GPT-4
- `gpt-4.1` - GPT-4.1
**Gemini models:**
- `gemini-2.5-pro` - Gemini 2.5 Pro (best quality)
- `gemini-2.5-flash` - Gemini 2.5 Flash (fast)
- `gemini-2.0-flash` - Gemini 2.0 Flash
- `gemini-2.5-flash-preview` - Gemini 2.5 Flash preview
- `gemini-2.5-flash-lite` - Gemini 2.5 Flash Lite
- `gemini-2.0-flash-lite` - Gemini 2.0 Flash Lite
### List all available models
```python
from gateways import get_available_models
# View the model mappings for every provider
models = get_available_models()
# View a specific provider's models
azure_models = get_available_models("azure")
openrouter_models = get_available_models("openrouter")
```
**Note:** When you use a unified model name, it is automatically mapped to each provider's actual model name. For example:
- `gpt-4o-mini` maps to `gpt-4o-mini` on Azure
- `gpt-4o-mini` maps to `openai/gpt-4o-mini` on OpenRouter
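To see exactly how a unified name resolves, you can inspect the result of `get_available_models()`. The sketch below assumes the per-provider result is a plain dict from unified names to provider-specific IDs; adjust it if your installed version returns a different structure:
```python
from gateways import get_available_models

# Assumption: get_available_models(provider) returns a dict of
# unified-name -> provider-specific model ID.
for provider in ("azure", "openrouter"):
    mapping = get_available_models(provider)
    print(provider, "gpt-4o-mini ->", mapping.get("gpt-4o-mini", "<not mapped>"))
```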
## 💡 Usage Examples
### Basic usage (unified model names)
```python
from gateways import chat
# Unified model names are mapped automatically
reply = chat("What is artificial intelligence?", "gpt-4o-mini", provider="azure")
print(reply)
# Same model name, different providers
reply_azure = chat("What is artificial intelligence?", "gpt-4o-mini", provider="azure")
reply_openrouter = chat("What is artificial intelligence?", "gpt-4o-mini", provider="openrouter")
```
### Passing parameters
```python
from gateways import chat
reply = chat(
    "Write a poem about spring",
    "gpt-4o-mini",
    provider="azure",
    temperature=0.8,
    max_tokens=500
)
```
### Multi-turn conversation
```python
from gateways import chat_with_history
messages = [
    {"role": "system", "content": "You are a friendly assistant"},
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help you?"},
    {"role": "user", "content": "Tell me a joke"}
]
reply = chat_with_history(messages, "gpt-4o-mini", provider="azure")
print(reply)
```
### Async calls
```python
import asyncio
from gateways import chat_async
async def main():
    tasks = [
        chat_async("What is AI?", "gpt-4o-mini", provider="azure"),
        chat_async("What is machine learning?", "gpt-4o-mini", provider="azure"),
    ]
    replies = await asyncio.gather(*tasks)
    for reply in replies:
        print(reply)
asyncio.run(main())
```
## 🔧 Configuration
### Automatic provider detection
If `provider` is not specified, the package detects one in the following order (illustrated in the sketch after this list):
1. If a default provider has been set via `set_provider()`, use it
2. If Azure is configured (`AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT`), use Azure
3. If OpenRouter is configured (`OPENROUTER_API_KEY`), use OpenRouter
4. If nothing is configured, raise an error
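For illustration only, the detection order above is roughly equivalent to the environment-variable check below. The package performs its own internal version of this; `detect_provider` here is a hypothetical helper, not part of the API:
```python
import os

def detect_provider() -> str:
    """Hypothetical helper mirroring the detection order above.

    Step 1 (a default set via set_provider()) is internal package state
    and is omitted here; only the environment checks are shown.
    """
    if os.getenv("AZURE_OPENAI_API_KEY") and os.getenv("AZURE_OPENAI_ENDPOINT"):
        return "azure"
    if os.getenv("OPENROUTER_API_KEY"):
        return "openrouter"
    raise RuntimeError("No provider configured: set Azure or OpenRouter credentials")

print(detect_provider())
```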
### Environment variables
**Azure OpenAI:**
- `AZURE_OPENAI_API_KEY` - API key (required)
- `AZURE_OPENAI_ENDPOINT` - endpoint URL (required)
**OpenRouter:**
- `OPENROUTER_API_KEY` - API key (required)
- `OPENROUTER_SITE_URL` - site URL (optional)
- `OPENROUTER_SITE_NAME` - site name (optional)
**Skywork:**
- `OPENAI_BASE_URL` - OpenAI API base URL (for GPT models)
- `OPENAI_API_KEY` - OpenAI API key (for GPT models)
- `GOOGLE_BASE_URL` - Google API base URL (for Gemini models)
- `GOOGLE_API_KEY` - Google API key (for Gemini models)
**Note:** Separate `.env` files in the subdirectories (`azure/`, `openrouter/`, or `skywork/`) are no longer needed; all configuration lives in the root `.env` file.
## 📁 Project Structure
```
gateways/
├── __init__.py        # package entry point
├── gateway.py         # core implementation
├── requirements.txt   # dependencies
├── README.md          # documentation
├── azure/             # Azure-related files
└── openrouter/        # OpenRouter-related files
```
## ⚡ Features
- ✅ **Unified interface** - one API across multiple providers
- ✅ **Automatic detection** - picks an available provider for you
- ✅ **Client caching** - connections are reused automatically for better performance
- ✅ **Sync and async** - both calling styles are supported
- ✅ **Easy to use** - a single line of code makes a call
- ✅ **Flexible configuration** - environment variables or code-based configuration
## 🆘 Troubleshooting
### Error: no available provider configuration found
Make sure at least one provider's API key and required environment variables are configured; the diagnostic sketch below can help confirm what the process actually sees.
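```python
import os

# Diagnostic only: report which documented variables are visible to the
# process (values are not printed, only whether they are set).
for var in ("AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT",
            "OPENROUTER_API_KEY", "OPENROUTER_SITE_URL",
            "OPENAI_BASE_URL", "OPENAI_API_KEY",
            "GOOGLE_BASE_URL", "GOOGLE_API_KEY"):
    print(f"{var}: {'set' if os.getenv(var) else 'missing'}")
```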
### Error: unsupported provider
Make sure the `provider` argument is `'azure'`, `'openrouter'`, or `'skywork'`.
### Azure calls fail
Check that:
1. `AZURE_OPENAI_API_KEY` is correct
2. `AZURE_OPENAI_ENDPOINT` is correct
3. The model deployment name is correct
### OpenRouter calls fail
Check that:
1. `OPENROUTER_API_KEY` is correct
2. Your account has sufficient credit
3. The model ID is correct (format: `provider/model-name`)
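When a call fails and the cause is not obvious, wrapping it and printing the exception is often the fastest way to see the provider's error message. The exact exception types depend on the underlying `openai` client, so this sketch catches broadly; it is a troubleshooting aid, not production error handling:
```python
from gateways import chat

try:
    reply = chat("ping", "gpt-4o-mini", provider="openrouter")
    print(reply)
except Exception as exc:  # broad catch is intentional for diagnostics
    print(f"OpenRouter call failed: {type(exc).__name__}: {exc}")
```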