# Observee Agents
A Python SDK for seamless integration of MCP (Model Context Protocol) tools with multiple LLM providers including Anthropic Claude, OpenAI GPT, and Google Gemini.
**Configure as many MCP servers/tools as you need at [observee.ai](https://observee.ai)**
## Features
- 🤖 **Multi-Provider Support**: Works with Anthropic, OpenAI, and Gemini
- 🔧 **Smart Tool Filtering**: BM25, local embeddings, and cloud-based filtering
- ⚡ **Fast Performance**: Intelligent caching and optimization
- 🔑 **Flexible Authentication**: URL-based or API key authentication
- 🔐 **OAuth Integration**: Built-in authentication flows for Gmail, Slack, Notion, and 15+ services
- 🎯 **Easy Integration**: Simple sync/async API
- 📡 **Streaming Support**: Real-time streaming responses for Anthropic, OpenAI, and Gemini
- 📦 **Pip Installable**: Easy installation and distribution
## Installation
```bash
# Basic installation
pip install observee-agents

# With optional dependencies
pip install observee-agents[embedding,cloud]

# Development installation
pip install observee-agents[dev]
```
## Quick Start
### Simple Synchronous Usage (Recommended)
```python
from observee_agents import chat_with_tools
result = chat_with_tools(
    message="Search for recent news about AI developments",
    provider="anthropic",
    model="claude-sonnet-4-20250514",
    observee_api_key="obs_your_key_here"
)
print("Response:", result["content"])
print("Tools used:", len(result["tool_calls"]))
```
### Explore Available Tools
```python
from observee_agents import list_tools, get_tool_info, filter_tools
# List all available tools
tools = list_tools(observee_api_key="obs_your_key_here")
print(f"Found {len(tools)} tools:")
for tool in tools[:5]:  # Show first 5
    print(f"- {tool['name']}: {tool['description']}")

# Get detailed info about a specific tool
tool_info = get_tool_info(
    tool_name="youtube_get_transcript",
    observee_api_key="obs_your_key_here"
)
if tool_info:
    print(f"Tool: {tool_info['name']}")
    print(f"Description: {tool_info['description']}")

# Find relevant tools for a task
relevant_tools = filter_tools(
    query="search YouTube videos",
    max_tools=3,
    observee_api_key="obs_your_key_here"
)
for tool in relevant_tools:
    print(f"- {tool['name']} (relevance: {tool['relevance_score']})")
```
### Execute Tools Directly
```python
from observee_agents import execute_tool
# Execute a tool directly, without involving an LLM
result = execute_tool(
    tool_name="youtube_get_transcript",
    tool_input={"video_url": "https://youtube.com/watch?v=dQw4w9WgXcQ"},
    observee_api_key="obs_your_key_here"
)
print(result)
```
### Streaming Responses
```python
import asyncio
from observee_agents import chat_with_tools_stream
async def stream_example():
    async for chunk in chat_with_tools_stream(
        message="What's the weather like today?",
        provider="openai",
        observee_api_key="obs_your_key_here"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
        elif chunk["type"] == "tool_result":
            print(f"\n[Tool executed: {chunk['tool_name']}]")

asyncio.run(stream_example())
```
### Advanced Async Usage
```python
import asyncio
from observee_agents import MCPAgent
async def advanced_example():
    async with MCPAgent(
        provider="anthropic",
        server_url="wss://mcp.observee.ai/mcp?client_id=your_id",
        auth_token="obs_your_key_here"
    ) as agent:
        result = await agent.chat_with_tools(
            message="What tools do you have access to?"
        )
        return result

result = asyncio.run(advanced_example())
print(result["content"])
```
### OAuth Authentication
The SDK includes built-in OAuth flows for authenticating with various services:
```python
from observee_agents import call_mcpauth_login, get_available_servers
# Get list of supported authentication servers
servers = get_available_servers()
print(f"Available servers: {servers['supported_servers']}")
# Start authentication flow for Gmail
response = call_mcpauth_login(auth_server="gmail")
print(f"Visit this URL to authenticate: {response['url']}")
# Start authentication flow for Slack
response = call_mcpauth_login(auth_server="slack")
print(f"Visit this URL to authenticate: {response['url']}")
```
**Supported Services**: Gmail, Google Calendar, Google Docs, Google Drive, Google Sheets, Slack, Notion, Linear, Asana, Outlook, OneDrive, Atlassian, Supabase, Airtable, Discord, and more.
## Configuration
### Environment Variables
```bash
# Option 1: API Key (Recommended)
export OBSERVEE_API_KEY="obs_your_key_here"
export OBSERVEE_CLIENT_ID="your_client_id"  # Optional

# Option 2: Direct URL
export OBSERVEE_URL="https://mcp.observee.ai/mcp"

# LLM Provider Keys
export ANTHROPIC_API_KEY="your_anthropic_key"
export OPENAI_API_KEY="your_openai_key"
export GOOGLE_API_KEY="your_google_key"
```
```
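As the function-parameter comments note, explicit parameters take priority over these environment variables. A minimal sketch of that precedence rule (the `resolve_observee_config` helper is illustrative only, not part of the SDK, which performs its own resolution internally):

```python
import os

def resolve_observee_config(observee_api_key=None, observee_url=None, client_id=None):
    """Illustrative sketch of the documented precedence:
    explicit parameters win over environment variables."""
    return {
        "api_key": observee_api_key or os.environ.get("OBSERVEE_API_KEY"),
        "url": observee_url or os.environ.get("OBSERVEE_URL"),
        "client_id": client_id or os.environ.get("OBSERVEE_CLIENT_ID"),
    }
```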
### Function Parameters
```python
from observee_agents import chat_with_tools
result = chat_with_tools(
    message="Your query here",

    # Provider Configuration
    provider="anthropic",              # "anthropic", "openai", "gemini"
    model="claude-sonnet-4-20250514",  # Auto-detected if not provided

    # Authentication (priority: params > env vars)
    observee_api_key="obs_your_key",
    observee_url="https://custom.mcp.server/endpoint",
    client_id="your_client_id",

    # Tool Filtering
    enable_filtering=True,  # True for filtered tools, False for all tools
    filter_type="bm25",     # "bm25", "local_embedding", "cloud"
    max_tools=20,           # Maximum number of tools after filtering
    min_score=8.0,          # Minimum relevance score

    # Performance
    sync_tools=False,       # True to clear caches and resync

    # Provider-specific args
    temperature=0.7,
    max_tokens=1000
)
```
## Examples
### Available Imports
```python
# Main chat functionality
from observee_agents import chat_with_tools, chat_with_tools_stream
# Tool exploration and management
from observee_agents import list_tools, get_tool_info, filter_tools, execute_tool
# Advanced usage
from observee_agents import MCPAgent
```
### Multiple Providers
```python
from observee_agents import chat_with_tools
# Anthropic Claude
result = chat_with_tools(
    message="Analyze this YouTube video",
    provider="anthropic",
    model="claude-sonnet-4-20250514"
)

# OpenAI GPT
result = chat_with_tools(
    message="Search for recent AI papers",
    provider="openai",
    model="gpt-4o"
)

# Google Gemini
result = chat_with_tools(
    message="Help me manage my emails",
    provider="gemini",
    model="gemini-2.5-pro"
)
```
### Tool Filtering Options
```python
from observee_agents import chat_with_tools
# Fast BM25 keyword filtering (default)
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="bm25",
    max_tools=5
)

# Semantic embedding filtering
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="local_embedding",
    max_tools=10
)

# Cloud hybrid search (requires API keys)
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="cloud",
    max_tools=15
)

# No filtering - use all available tools
result = chat_with_tools(
    message="What can you do?",
    enable_filtering=False
)
```
### Custom Configuration
```python
from observee_agents import chat_with_tools
# Custom Observee server
result = chat_with_tools(
    message="Custom server query",
    observee_url="https://your-custom-server.com/mcp",
    client_id="custom_client_123"
)

# Force cache refresh
result = chat_with_tools(
    message="Get fresh results",
    sync_tools=True  # Clears caches
)
```
## Response Format
```python
{
    "content": "The AI response text",
    "tool_calls": [
        {
            "name": "tool_name",
            "input": {"param": "value"}
        }
    ],
    "tool_results": [
        {
            "tool": "tool_name",
            "result": "tool output"
        }
    ],
    "filtered_tools_count": 5,
    "filtered_tools": ["tool1", "tool2", "tool3"],
    "used_filtering": True
}
```
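To make the shape above concrete, here is a small hypothetical helper (not part of the SDK) that summarizes a result dict using only the documented keys:

```python
def summarize_response(result):
    """Summarize a chat_with_tools result dict.
    Illustrative helper; relies only on the documented response keys."""
    lines = [result.get("content", "")]
    for call in result.get("tool_calls", []):
        lines.append(f"called {call['name']} with {call['input']}")
    if result.get("used_filtering"):
        lines.append(f"filtered to {result['filtered_tools_count']} tools")
    return "\n".join(lines)

example = {
    "content": "The AI response text",
    "tool_calls": [{"name": "tool_name", "input": {"param": "value"}}],
    "tool_results": [{"tool": "tool_name", "result": "tool output"}],
    "filtered_tools_count": 5,
    "filtered_tools": ["tool1", "tool2", "tool3"],
    "used_filtering": True,
}
print(summarize_response(example))
```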
## Available Tools
The SDK provides access to various MCP tools including:
- **📧 Gmail**: Email management, search, compose, labels
- **🎥 YouTube**: Video transcript retrieval and analysis
- **📋 Linear**: Project management, issues, comments
- **🔍 Brave Search**: Web search and local business lookup
- **And many more...**
## Filter Types
### BM25 Filter (Default)
- **Speed**: ⚡ ~1-5ms per query
- **Best for**: Fast keyword matching, production use
- **Dependencies**: None (built-in)
### Local Embedding Filter
- **Speed**: ⚡ ~10ms per query
- **Best for**: Semantic search without cloud dependencies
- **Dependencies**: `fastembed`
### Cloud Filter
- **Speed**: 🐌 ~300-400ms per query
- **Best for**: Highest quality hybrid search
- **Dependencies**: `pinecone-client`, `openai`
- **Requirements**: `PINECONE_API_KEY`, `OPENAI_API_KEY`
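The approximate latencies above suggest a simple selection rule: take the highest-quality filter that fits your latency budget. The sketch below is purely illustrative; `pick_filter_type` and the latency table are assumptions for this example, not SDK API, and the numbers are rough guides rather than guarantees.

```python
# Approximate per-query latencies (upper bounds, ms) from the table above.
FILTER_LATENCY_MS = {"bm25": 5, "local_embedding": 10, "cloud": 400}

def pick_filter_type(latency_budget_ms):
    """Pick the highest-quality filter_type that fits a latency budget.
    Quality ordering (cloud > local_embedding > bm25) follows the
    'Best for' notes above."""
    for name in ("cloud", "local_embedding", "bm25"):
        if FILTER_LATENCY_MS[name] <= latency_budget_ms:
            return name
    return "bm25"  # fall back to the cheapest option
```

The chosen value can then be passed as `filter_type` to `chat_with_tools`.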
## Development
```bash
# Clone and install in development mode
git clone https://github.com/observee-ai/mcp-agent-system.git  # coming soon
cd mcp-agent-system
pip install -e .[dev]

# Run tests
pytest

# Format code
black observee_agents/
```
## License
All rights reserved. This software is proprietary and confidential. Unauthorized copying, distribution, or use is strictly prohibited.
## Support
- 📖 [Documentation](https://docs.observee.ai/mcp-agent-system)
- 🐛 [Issue Tracker](https://github.com/observee-ai/mcp-agent-system/issues)
- 💬 [Discord Community](https://discord.gg/jnf8yHWJ)
- 📧 [Email Support](mailto:contact@observee.ai)