# agentic-auth


- **Name**: agentic-auth
- **Version**: 0.0.1
- **Summary**: A Python SDK for MCP tool integration with LLM providers
- **Upload time**: 2025-07-28 07:30:44
- **Requires Python**: >=3.10
- **License**: Proprietary
- **Keywords**: mcp, agent, llm, anthropic, openai, gemini, tools
# Observee Agents

A Python SDK for seamless integration of MCP (Model Context Protocol) tools with multiple LLM providers including Anthropic Claude, OpenAI GPT, and Google Gemini.

**Configure as many MCP servers/tools as you need at [observee.ai](https://observee.ai)**

## Features

- 🤖 **Multi-Provider Support**: Works with Anthropic, OpenAI, and Gemini
- 🔧 **Smart Tool Filtering**: BM25, local embeddings, and cloud-based filtering
- ⚡ **Fast Performance**: Intelligent caching and optimization
- 🔑 **Flexible Authentication**: URL-based or API key authentication
- 🔐 **OAuth Integration**: Built-in authentication flows for Gmail, Slack, Notion, and 15+ services
- 🎯 **Easy Integration**: Simple sync/async API
- 📡 **Streaming Support**: Real-time streaming responses for Anthropic, OpenAI, and Gemini
- 🗨️ **Conversation History**: Persistent memory across chat sessions
- 🎯 **Custom System Prompts**: Personalized AI behavior and expertise
- 📦 **Pip Installable**: Easy installation and distribution

## Installation

```bash
# Basic installation
pip install observee-agents

# With optional dependencies (quoted so shells like zsh don't expand the brackets)
pip install "observee-agents[embedding,cloud]"

# Development installation
pip install "observee-agents[dev]"
```

## Quick Start

### Simple Synchronous Usage (Recommended)

```python
from observee_agents import chat_with_tools

result = chat_with_tools(
    message="Search for recent news about AI developments",
    provider="anthropic",
    model="claude-sonnet-4-20250514",
    observee_api_key="obs_your_key_here"
)

print("Response:", result["content"])
print("Tools used:", len(result["tool_calls"]))
```

### Explore Available Tools

```python
from observee_agents import list_tools, get_tool_info, filter_tools

# List all available tools
tools = list_tools(observee_api_key="obs_your_key_here")
print(f"Found {len(tools)} tools:")
for tool in tools[:5]:  # Show first 5
    print(f"- {tool['name']}: {tool['description']}")

# Get detailed info about a specific tool
tool_info = get_tool_info(
    tool_name="youtube_get_transcript",
    observee_api_key="obs_your_key_here"
)
if tool_info:
    print(f"Tool: {tool_info['name']}")
    print(f"Description: {tool_info['description']}")

# Find relevant tools for a task
relevant_tools = filter_tools(
    query="search YouTube videos",
    max_tools=3,
    observee_api_key="obs_your_key_here"
)
for tool in relevant_tools:
    print(f"- {tool['name']} (relevance: {tool['relevance_score']})")
```

### Execute Tools Directly

```python
from observee_agents import execute_tool

# Execute a tool directly without LLM
result = execute_tool(
    tool_name="youtube_get_transcript", 
    tool_input={"video_url": "https://youtube.com/watch?v=dQw4w9WgXcQ"},
    observee_api_key="obs_your_key_here"
)
print(result)
```

### Custom Tools

You can extend the SDK with your own custom tools alongside the MCP tools:

```python
from observee_agents import chat_with_tools_stream
import asyncio

# Define custom tool handler
async def custom_tool_handler(tool_name: str, tool_input: dict) -> str:
    """Handle custom tool executions"""
    if tool_name == "add_numbers":
        return str(tool_input.get("a", 0) + tool_input.get("b", 0))
    elif tool_name == "multiply_numbers":
        return str(tool_input.get("a", 0) * tool_input.get("b", 0))
    elif tool_name == "get_time":
        from datetime import datetime
        return datetime.now().strftime("%I:%M %p")
    else:
        return f"Unknown tool: {tool_name}"

# Define custom tools in OpenAI format
custom_tools = [
    {
        "type": "function",
        "function": {
            "name": "add_numbers",
            "description": "Add two numbers together",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"type": "number", "description": "First number"},
                    "b": {"type": "number", "description": "Second number"}
                },
                "required": ["a", "b"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "get_time",
            "description": "Get the current time",
            "parameters": {
                "type": "object",
                "properties": {}
            }
        }
    }
]

# Use custom tools with MCP tools
async def custom_example():
    async for chunk in chat_with_tools_stream(
        message="What's 5 + 3? Also, what time is it?",
        provider="openai",
        custom_tools=custom_tools,
        custom_tool_handler=custom_tool_handler,
        observee_api_key="obs_your_key_here"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
        elif chunk["type"] == "tool_result":
            print(f"\n🔧 [Tool: {chunk['tool_name']} = {chunk['result']}]")

asyncio.run(custom_example())
```

Custom tools work seamlessly with all providers (Anthropic, OpenAI, Gemini) and can be combined with MCP tools for enhanced functionality.

### Streaming Responses

```python
import asyncio
from observee_agents import chat_with_tools_stream

async def stream_example():
    async for chunk in chat_with_tools_stream(
        message="What's the weather like today?",
        provider="openai",
        observee_api_key="obs_your_key_here"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
        elif chunk["type"] == "tool_result":
            print(f"\n[Tool executed: {chunk['tool_name']}]")

asyncio.run(stream_example())
```

### 🆕 Conversational AI with Memory

```python
import asyncio
from observee_agents import chat_with_tools_stream, get_conversation_history

async def conversation_example():
    # Create a specialized assistant with conversation memory
    session_id = "my_email_assistant"
    custom_prompt = "You are a professional email assistant. Be concise and helpful."
    
    # First message with custom system prompt
    async for chunk in chat_with_tools_stream(
        message="Search for emails about meetings",
        provider="anthropic",
        session_id=session_id,  # 🆕 Enables conversation memory
        system_prompt=custom_prompt,  # 🆕 Custom AI behavior
        observee_api_key="obs_your_key_here"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
    
    print("\n" + "="*40 + "\n")
    
    # Follow-up - remembers previous context!
    async for chunk in chat_with_tools_stream(
        message="What was the subject of the first meeting?",
        session_id=session_id,  # Same session = memory!
        observee_api_key="obs_your_key_here"
    ):
        if chunk["type"] == "content":
            print(chunk["content"], end="", flush=True)
    
    # Check conversation history
    history = get_conversation_history(session_id)
    print(f"\n📊 Conversation has {len(history)} messages")

asyncio.run(conversation_example())
```

### Advanced Async Usage

```python
import asyncio
from observee_agents import MCPAgent

async def advanced_example():
    async with MCPAgent(
        provider="anthropic",
        server_url="wss://mcp.observee.ai/mcp?client_id=your_id",
        auth_token="obs_your_key_here"
    ) as agent:
        result = await agent.chat_with_tools(
            message="What tools do you have access to?"
        )
        return result

result = asyncio.run(advanced_example())
print(result["content"])
```

### OAuth Authentication

The SDK includes built-in OAuth flows for authenticating with various services:

```python
from observee_agents import call_mcpauth_login, get_available_servers

# Get list of supported authentication servers
servers = get_available_servers()
print(f"Available servers: {servers['supported_servers']}")

# Start authentication flow for Gmail
response = call_mcpauth_login(auth_server="gmail")
print(f"Visit this URL to authenticate: {response['url']}")

# Start authentication flow for Slack
response = call_mcpauth_login(auth_server="slack")
```

**Supported Services**: Gmail, Google Calendar, Google Docs, Google Drive, Google Sheets, Slack, Notion, Linear, Asana, Outlook, OneDrive, Atlassian, Supabase, Airtable, Discord, and more.

## Configuration

### Environment Variables

```bash
# Option 1: API Key (Recommended)
export OBSERVEE_API_KEY="obs_your_key_here"
export OBSERVEE_CLIENT_ID="your_client_id"  # Optional

# Option 2: Direct URL
export OBSERVEE_URL="https://mcp.observee.ai/mcp"

# LLM Provider Keys
export ANTHROPIC_API_KEY="your_anthropic_key"
export OPENAI_API_KEY="your_openai_key" 
export GOOGLE_API_KEY="your_google_key"
```

### Function Parameters

```python
from observee_agents import chat_with_tools

result = chat_with_tools(
    message="Your query here",
    
    # Provider Configuration
    provider="anthropic",  # "anthropic", "openai", "gemini"
    model="claude-sonnet-4-20250514",  # Auto-detected if not provided
    
    # Authentication (priority: params > env vars)
    observee_api_key="obs_your_key",
    observee_url="https://custom.mcp.server/endpoint",
    client_id="your_client_id",
    
    # Tool Filtering
    enable_filtering=True,  # True for filtered tools, False for all tools
    filter_type="bm25",     # "bm25", "local_embedding", "cloud"
    max_tools=20,           # Maximum tools to filter
    min_score=8.0,          # Minimum relevance score
    
    # Performance
    sync_tools=False,       # True to clear caches and resync
    
    # 🆕 Conversation features
    session_id="my_assistant",  # Enable conversation memory
    system_prompt="You are a helpful expert...",  # Custom AI behavior
    
    # Provider-specific args
    temperature=0.7,
    max_tokens=1000
)
```

## Examples

### Available Imports

```python
# Main chat functionality
from observee_agents import chat_with_tools, chat_with_tools_stream

# Tool exploration and management
from observee_agents import list_tools, get_tool_info, filter_tools, execute_tool

# 🆕 Conversation management
from observee_agents import (
    get_conversation_history, 
    reset_conversation_history,
    list_sessions, 
    clear_session
)

# Advanced usage
from observee_agents import MCPAgent
```

### Multiple Providers

```python
from observee_agents import chat_with_tools

# Anthropic Claude
result = chat_with_tools(
    message="Analyze this YouTube video",
    provider="anthropic",
    model="claude-sonnet-4-20250514"
)

# OpenAI GPT
result = chat_with_tools(
    message="Search for recent AI papers", 
    provider="openai",
    model="gpt-4o"
)

# Google Gemini
result = chat_with_tools(
    message="Help me manage my emails",
    provider="gemini", 
    model="gemini-2.5-pro"
)
```

### 🆕 Specialized AI Assistants

```python
import asyncio
from observee_agents import chat_with_tools_stream

async def specialists_example():
    # Email management specialist
    async for chunk in chat_with_tools_stream(
        message="Help me organize my inbox",
        session_id="email_bot",
        system_prompt="You are an email productivity expert. Focus on organization and efficiency.",
        provider="anthropic"
    ):
        pass  # Handle streaming response...

    # Data analysis specialist
    async for chunk in chat_with_tools_stream(
        message="Analyze the latest sales data",
        session_id="data_bot",
        system_prompt="You are a data scientist. Provide technical insights and actionable recommendations.",
        provider="openai"
    ):
        pass  # Handle streaming response...

    # Content creation specialist
    async for chunk in chat_with_tools_stream(
        message="Create a YouTube video summary",
        session_id="content_bot",
        system_prompt="You are a content strategist. Focus on engagement and storytelling.",
        provider="gemini"
    ):
        pass  # Handle streaming response...

asyncio.run(specialists_example())
```

### Tool Filtering Options

```python
from observee_agents import chat_with_tools

# Fast BM25 keyword filtering (default)
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="bm25",
    max_tools=5
)

# Semantic embedding filtering
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="local_embedding",
    max_tools=10
)

# Cloud hybrid search (requires API keys)
result = chat_with_tools(
    message="Find relevant tools",
    filter_type="cloud",
    max_tools=15
)

# No filtering - use all available tools
result = chat_with_tools(
    message="What can you do?",
    enable_filtering=False
)
```

### Custom Configuration

```python
from observee_agents import chat_with_tools

# Custom Observee server
result = chat_with_tools(
    message="Custom server query",
    observee_url="https://your-custom-server.com/mcp",
    client_id="custom_client_123"
)

# Force cache refresh
result = chat_with_tools(
    message="Get fresh results", 
    sync_tools=True  # Clears caches
)
```

## Response Format

```python
{
    "content": "The AI response text",
    "tool_calls": [
        {
            "name": "tool_name",
            "input": {"param": "value"}
        }
    ],
    "tool_results": [
        {
            "tool": "tool_name", 
            "result": "tool output"
        }
    ],
    "filtered_tools_count": 5,
    "filtered_tools": ["tool1", "tool2", "tool3"],
    "used_filtering": True
}
```
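
The fields above can be inspected directly once a call returns. Here is a small, self-contained illustration using a sample response dict (`summarize_response` is a hypothetical helper written for this example, not an SDK function):

```python
# Hypothetical helper (not part of the SDK) that turns a response dict
# into a one-line summary of the tools the model invoked.
def summarize_response(result: dict) -> str:
    names = [call["name"] for call in result.get("tool_calls", [])]
    if not names:
        return "no tool calls"
    return f"{len(names)} tool call(s): {', '.join(names)}"

sample = {
    "content": "The AI response text",
    "tool_calls": [{"name": "youtube_get_transcript", "input": {"video_url": "..."}}],
    "tool_results": [{"tool": "youtube_get_transcript", "result": "transcript text"}],
    "filtered_tools_count": 5,
    "filtered_tools": ["tool1", "tool2", "tool3"],
    "used_filtering": True,
}
print(summarize_response(sample))  # → 1 tool call(s): youtube_get_transcript
```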

## Available Tools

The SDK provides access to various MCP tools including:

- **📧 Gmail**: Email management, search, compose, labels
- **🎥 YouTube**: Video transcript retrieval and analysis
- **📋 Linear**: Project management, issues, comments
- **🔍 Brave Search**: Web search and local business lookup
- **And many more...**

## Filter Types

### BM25 Filter (Default)
- **Speed**: ⚡ ~1-5ms per query
- **Best for**: Fast keyword matching, production use
- **Dependencies**: None (built-in)

### Local Embedding Filter  
- **Speed**: ⚡ ~10ms per query
- **Best for**: Semantic search without cloud dependencies
- **Dependencies**: `fastembed`

### Cloud Filter
- **Speed**: 🐌 ~300-400ms per query  
- **Best for**: Highest quality hybrid search
- **Dependencies**: `pinecone-client`, `openai`
- **Requirements**: `PINECONE_API_KEY`, `OPENAI_API_KEY`
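
To give a feel for the kind of keyword matching the default BM25 filter performs, here is a tiny self-contained BM25-style scorer. This is illustrative only; the SDK's actual filter implementation is not shown here and may differ.

```python
import math

def bm25_scores(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each document against the query with Okapi BM25."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(d) for d in tokenized) / len(tokenized)  # average doc length
    n = len(tokenized)
    scores = []
    for doc in tokenized:
        score = 0.0
        for term in query.lower().split():
            df = sum(1 for d in tokenized if term in d)  # document frequency
            if df == 0:
                continue
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            tf = doc.count(term)  # term frequency in this document
            score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
        scores.append(score)
    return scores

tool_descriptions = [
    "youtube_get_transcript: fetch the transcript of a YouTube video",
    "gmail_search: search emails in a Gmail inbox",
]
scores = bm25_scores("youtube video transcript", tool_descriptions)
print(scores)  # the YouTube tool scores higher than the Gmail tool
```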

## Development

```bash
# Clone and install in development mode
git clone https://github.com/observee-ai/mcp-agent-system.git  # repository coming soon
cd mcp-agent-system
pip install -e ".[dev]"

# Run tests
pytest

# Format code
black observee_agents/
```

## License

All rights reserved. This software is proprietary and confidential. Unauthorized copying, distribution, or use is strictly prohibited.

## Support

- 📖 [Documentation](https://docs.observee.ai/mcp-agent-system)
- 🐛 [Issue Tracker](https://github.com/observee-ai/mcp-agent-system/issues)
- 💬 [Discord Community](https://discord.gg/jnf8yHWJ)
- 📧 [Email Support](mailto:contact@observee.ai)

            
