jl-ecms-client

Name: jl-ecms-client
Version: 0.3.0
Home page: https://github.com/Mirix-AI/MIRIX
Summary: Intuit ECMS Client - Lightweight Python client for Intuit ECMS server
Upload time: 2025-11-11 22:20:42
Author: Mirix AI
Requires Python: >=3.10
License: Apache License 2.0
Keywords: ai, memory, agent, llm, assistant, client, api
Requirements: pytz, numpy, pandas, openpyxl, Markdown, Pillow, scikit-image, openai, tiktoken, google-genai, python-dotenv, demjson3, pathvalidate, docstring_parser, sqlalchemy, pydantic-settings, jinja2, humps, composio, colorama, anthropic, httpx_sse, rapidfuzz, rank-bm25, psutil, llama_index, llama-index-embeddings-google-genai, fastapi, uvicorn, pydub, python-multipart, opentelemetry-api, opentelemetry-sdk, opentelemetry-exporter-otlp, opentelemetry-instrumentation-requests, SpeechRecognition, ffmpeg, pg8000, pgvector, json_repair, rich, psycopg2-binary, anyio, mcp, google-auth, google-auth-oauthlib, google-auth-httplib2, google-api-python-client, ruff, nodeenv, pyright, pytest
![Mirix Logo](https://github.com/RenKoya1/MIRIX/raw/main/assets/logo.png)

## MIRIX - Multi-Agent Personal Assistant with an Advanced Memory System

Your personal AI that builds memory through screen observation and natural conversation

| 🌐 [Website](https://mirix.io) | 📚 [Documentation](https://docs.mirix.io) | 📄 [Paper](https://arxiv.org/abs/2507.07957) | 💬 [Discord](https://discord.gg/S6CeHNrJ) 
<!-- | [Twitter/X](https://twitter.com/mirix_ai) | [Discord](https://discord.gg/S6CeHNrJ) | -->

---

### Key Features 🔥

- **Multi-Agent Memory System:** Six specialized memory components (Core, Episodic, Semantic, Procedural, Resource, Knowledge Vault) managed by dedicated agents
- **Screen Activity Tracking:** Continuous visual data capture and intelligent consolidation into structured memories  
- **Privacy-First Design:** All long-term data stored locally with user-controlled privacy settings
- **Advanced Search:** PostgreSQL-native BM25 full-text search with vector similarity support
- **Multi-Modal Input:** Text, images, voice, and screen captures processed seamlessly
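
For illustration, the six memory components named above could be modeled as a simple enum. This is only a sketch based on the feature list; the actual Mirix schema and type names may differ:

```python
from enum import Enum

class MemoryType(Enum):
    """The six memory components from the feature list (illustrative names only)."""
    CORE = "core"
    EPISODIC = "episodic"
    SEMANTIC = "semantic"
    PROCEDURAL = "procedural"
    RESOURCE = "resource"
    KNOWLEDGE_VAULT = "knowledge_vault"

# In MIRIX, each component is managed by its own dedicated agent.
print([m.value for m in MemoryType])
```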

### Quick Start
**End-Users**: If you want to build your own memory with MIRIX, please check out the quick installation guide [here](https://docs.mirix.io/getting-started/installation/#quick-installation-dmg).

**Developers**: To use our memory system as a backend, please check out our [Backend Usage](https://docs.mirix.io/user-guide/backend-usage/) guide. In short, you just need to run:
```bash
git clone git@github.com:Mirix-AI/MIRIX.git
cd MIRIX

# Create and activate virtual environment
python -m venv mirix_env
source mirix_env/bin/activate  # On Windows: mirix_env\Scripts\activate

pip install -r requirements.txt
```
Then run the following Python code:
```python
from mirix.agent import AgentWrapper

# Initialize agent with configuration
agent = AgentWrapper("./mirix/configs/mirix.yaml")

# Send basic text information
agent.send_message(
    message="The moon now has a president.",
    memorizing=True,
    force_absorb_content=True
)
```
For more details, please refer to [Backend Usage](https://docs.mirix.io/user-guide/backend-usage/).

## Python SDK (NEW!) 🎉

We've created a simple [Python SDK](https://pypi.org/project/mirix/0.1.5/) that makes it easy to integrate Mirix's memory capabilities into your applications:

### Installation
Install from [PyPI](https://pypi.org/project/mirix/0.1.5/):
```bash
pip install mirix
```

### Quick Start with SDK
```python
from mirix import Mirix

# Initialize memory agent (defaults to Google Gemini 2.0 Flash)
memory_agent = Mirix(api_key="your-google-api-key")

# Add memories
memory_agent.add("The moon now has a president")
memory_agent.add("John loves Italian food and is allergic to peanuts")

# Chat with memory context
response = memory_agent.chat("Does the moon have a president?")
print(response)  # "Yes, according to my memory, the moon has a president."

response = memory_agent.chat("What does John like to eat?") 
print(response)  # "John loves Italian food. However, he's allergic to peanuts."
```
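
To make the add/chat pattern above concrete without requiring the `mirix` package, here is a stdlib-only stand-in. `ToyMemory` is NOT the Mirix implementation (Mirix uses BM25 full-text search plus vector similarity); it just shows how stored facts can be retrieved as context for a chat turn:

```python
class ToyMemory:
    """Toy stand-in for the Mirix memory agent (naive keyword retrieval)."""

    def __init__(self):
        self.facts = []

    def add(self, fact: str) -> None:
        self.facts.append(fact)

    def retrieve(self, question: str) -> list[str]:
        # Naive word-overlap retrieval; real Mirix uses BM25 + vectors.
        q_words = {w.lower().strip("?.,") for w in question.split()}
        return [f for f in self.facts
                if q_words & {w.lower().strip("?.,") for w in f.split()}]

memory = ToyMemory()
memory.add("The moon now has a president")
memory.add("John loves Italian food and is allergic to peanuts")

# Retrieved facts would be injected into the chat prompt as context.
print(memory.retrieve("Does the moon have a president?"))
# ['The moon now has a president']
```

In the real SDK, `memory_agent.chat(...)` performs this retrieval internally before answering.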

## Integration with Claude Agent SDK 🤝

Mirix can be integrated with [Anthropic's Claude Agent SDK](https://docs.claude.com/en/api/agent-sdk/python) to give Claude persistent memory across conversations. This allows Claude to remember context, user preferences, and past interactions.

### Basic Setup

Here's a simple example of integrating Mirix with the Claude Agent SDK:

```python
#!/usr/bin/env python3
import os
import asyncio
from claude_agent_sdk import query, ClaudeAgentOptions, AssistantMessage
from mirix import Mirix
from collections import deque
from dotenv import load_dotenv
load_dotenv()

# Configuration
MEMORY_UPDATE_INTERVAL = 3  # Update memory every N turns
REINIT_INTERVAL = 1  # Rebuild system prompt (retrieve from Mirix) every N turns
KEEP_LAST_N_TURNS = 50  # Keep last N turns in memory buffer

def build_system_prompt(mirix_agent=None, user_id=None, conversation_buffer=""):
    """Build system prompt with optional Mirix memory context"""
    system_prompt = """You are a helpful assistant."""
    
    # Add Mirix memory context if available
    if mirix_agent and user_id and conversation_buffer:
        memory_context = mirix_agent.extract_memory_for_system_prompt(
            conversation_buffer, user_id
        )
        if memory_context:
            system_prompt += "\n\nRelevant Memory Context:\n" + memory_context
    
    return system_prompt

async def run_agent():
    """Run Claude Agent SDK with Mirix memory integration"""
    
    # Initialize Mirix memory agent
    mirix_agent = Mirix(
        model_name="gemini-2.0-flash",
        api_key=os.getenv("GEMINI_API_KEY"),
    )
    user = mirix_agent.create_user(user_name="Alice")
    
    # Track conversation for memory updates
    conversation_history = deque(maxlen=KEEP_LAST_N_TURNS)
    turn_count = 0
    turns_since_reinit = 0
    session_id = None
    
    while True:
        # Build system prompt with memory context
        options = ClaudeAgentOptions(
            resume=session_id,
            allowed_tools=["Task", "Bash", "Read", "Edit", "Write", "WebSearch"],
            system_prompt=build_system_prompt(
                mirix_agent, user.id, 
                "\n".join([f"User: {u}\nAssistant: {a}" for u, a in conversation_history])
            ),
            model="claude-sonnet-4-5",
            max_turns=50
        )
        
        user_input = input("User: ").strip()
        if user_input.lower() in ['exit', 'quit', 'bye']:
            break
        
        # Get Claude's response
        assistant_response = ""
        async for message in query(prompt=user_input, options=options):
            if hasattr(message, 'subtype') and message.subtype == 'init':
                session_id = message.data.get('session_id')
            
            if isinstance(message, AssistantMessage):
                for block in message.content:
                    if hasattr(block, 'text'):
                        assistant_response += block.text
                        print(block.text, flush=True)
        
        # Update conversation history
        conversation_history.append((user_input, assistant_response))
        turn_count += 1
        turns_since_reinit += 1
        
        # Periodically update Mirix memory
        if turn_count % MEMORY_UPDATE_INTERVAL == 0:
            combined = "\n".join([
                f"[User] {u}\n[Assistant] {a}" 
                for u, a in conversation_history
            ])
            await asyncio.to_thread(mirix_agent.add, combined, user_id=user.id)
            print("✅ Memory updated!")

if __name__ == "__main__":
    asyncio.run(run_agent())
```

### Key Benefits

- **Persistent Memory:** Mirix helps Claude remember facts, preferences, and context across sessions
- **Intelligent Retrieval:** Mirix automatically retrieves relevant memories for each conversation
- **Scalable:** Works with conversations of any length without token limit issues
- **Flexible Updates:** Configure how often to update memory (e.g., every N turns)

### Example Usage

```bash
cd samples

# Install dependencies
pip install mirix claude-agent-sdk python-dotenv

# Set environment variables
export GEMINI_API_KEY="your-google-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"

# Run the agent
python claude_agent.py
```

## License

Mirix is released under the Apache License 2.0. See the [LICENSE](LICENSE) file for more details.

## Contact

For questions, suggestions, or issues, please open an issue on the GitHub repository or contact us at `yuwang@mirix.io`

## Join Our Community

Connect with other Mirix users, share your thoughts, and get support:

### 💬 Discord Community
Join our Discord server for real-time discussions, support, and community updates:
**[https://discord.gg/S6CeHNrJ](https://discord.gg/S6CeHNrJ)**

### 🎯 Weekly Discussion Sessions
We host weekly discussion sessions where you can:
- Discuss issues and bugs
- Share ideas about future directions
- Get general consultations and support
- Connect with the development team and community

**📅 Schedule:** Friday nights, 8-9 PM PST  
**🔗 Zoom Link:** [https://ucsd.zoom.us/j/96278791276](https://ucsd.zoom.us/j/96278791276)

### 📱 WeChat Group
<div align="center">
<img src="frontend/public/wechat-qr.jpg" alt="WeChat QR Code" width="200"/><br/>
<strong>WeChat Group</strong>
</div>

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=Mirix-AI/MIRIX&type=Date)](https://star-history.com/#Mirix-AI/MIRIX&Date)

## Acknowledgement
We would like to thank [Letta](https://github.com/letta-ai/letta) for open-sourcing their framework, which served as the foundation for the memory system in this project.

            
