# Agent Zero Lite
A lightweight, cross-platform implementation of Agent Zero that maintains core functionality while reducing complexity and dependencies.
## Features
✅ **Full LiteLLM Support** - 100+ AI providers (OpenAI, Anthropic, Google, local models, etc.)
✅ **Web UI** - Complete interface at localhost:50001
✅ **Vector Memory** - FAISS-based persistent memory
✅ **Document RAG** - PDF, text, and document processing
✅ **Multi-Agent** - Superior/subordinate agent hierarchy
✅ **MCP Client** - Model Context Protocol integration
✅ **Local Execution** - Python, Node.js, and terminal
✅ **Tunneling** - Remote access support
✅ **File Management** - Work directory browser
## Removed from Full Version
❌ Browser automation (Playwright)
❌ Docker/SSH execution
❌ Speech processing (STT/TTS)
❌ Task scheduling
❌ Backup/restore system
❌ Web search tools
## Quick Start
1. **Install dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

2. **Configure environment:**

   ```bash
   cp .env.example .env
   # Edit .env with your API keys
   ```

3. **Start the Web UI:**

   ```bash
   python run_ui.py
   ```

4. **Open browser:**

   ```
   http://localhost:50001
   ```
## Configuration
### Minimal Setup
Set at least one LLM provider in `.env`:
```bash
# OpenAI
OPENAI_API_KEY=sk-...
# Or Anthropic
ANTHROPIC_API_KEY=sk-ant-...
# Or local Ollama
CHAT_MODEL_PROVIDER=ollama
CHAT_MODEL_NAME=llama3.1:8b
OLLAMA_API_BASE=http://localhost:11434
```
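These settings are plain `KEY=VALUE` pairs read into the process environment at startup. As a minimal sketch of how such a file could be parsed (the actual project may use a library such as `python-dotenv`; this loader is illustrative only):

```python
import os

def load_env(path=".env"):
    """Minimal .env loader sketch: KEY=VALUE lines, '#' comments ignored."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and lines without an '=' separator.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    # Export the parsed values so downstream code can read os.environ.
    os.environ.update(values)
    return values
```

After loading the Ollama example above, `os.environ["CHAT_MODEL_PROVIDER"]` would hold `"ollama"`.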
### Full Configuration
See `.env.example` for all available options including:
- All 100+ LiteLLM providers
- Model configurations
- Rate limiting settings
- MCP server integration
- Memory and knowledge settings
## Supported Models
Agent Zero Lite supports **all LiteLLM providers**:
### Commercial APIs
- **OpenAI:** GPT-4o, GPT-4, GPT-3.5, etc.
- **Anthropic:** Claude 3.5 Sonnet, Claude 3 Opus, etc.
- **Google:** Gemini 1.5 Pro, Gemini 1.5 Flash, etc.
- **Groq:** Llama 3.1, Mixtral, etc. (fast inference)
- **Together AI:** Llama, Mistral, etc.
- **Mistral AI:** Mistral Large, Mistral 7B, etc.
- **Cohere:** Command R+, Command Light, etc.
### Local Models
- **Ollama:** Any local model (llama3.1, mistral, etc.)
- **LM Studio:** Local model server
- **Text Generation WebUI:** Local inference
- **VLLM:** High-performance inference server
### Enterprise
- **Azure OpenAI:** Enterprise GPT models
- **AWS Bedrock:** Claude, Titan, etc.
- **Google Vertex AI:** Enterprise Gemini
- **Hugging Face:** Hosted models
## Usage Examples
### Basic Chat
```python
from agent import AgentContext
import initialize
# Initialize agent
config = initialize.initialize_agent()
context = AgentContext(config)
# Send message
response = context.communicate("Hello, what can you help me with?")
```
### Code Execution
The agent can execute Python, Node.js, and terminal commands:
```
User: "Create a Python script that calculates Fibonacci numbers"
Agent: Uses code_execution tool to write and run Python code
```
### Document Processing
```
User: "Analyze this PDF document and summarize the key points"
Agent: Uses document_query tool to process and analyze documents
```
### Multi-Agent Collaboration
```
User: "Create a complex analysis using multiple specialized agents"
Agent: Uses call_subordinate to delegate tasks to specialized sub-agents
```
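The superior/subordinate pattern behind `call_subordinate` can be sketched as follows. This is illustrative only; the class and method signatures are hypothetical, not Agent Zero Lite's actual API.

```python
# Hypothetical sketch of a superior agent delegating to specialized
# subordinates; names and structure are illustrative, not the real API.
class SketchAgent:
    def __init__(self, role, depth=0):
        self.role = role
        self.depth = depth
        self.subordinates = []

    def call_subordinate(self, role, task):
        # Spawn a specialized sub-agent one level down and delegate the task.
        sub = SketchAgent(role, self.depth + 1)
        self.subordinates.append(sub)
        return sub.handle(task)

    def handle(self, task):
        return f"[{self.role} @ depth {self.depth}] completed: {task}"

boss = SketchAgent("coordinator")
result = boss.call_subordinate("data-analyst", "summarize sales data")
```

Tracking `depth` lets a superior cap how far delegation can nest, which is the usual safeguard against runaway agent hierarchies.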
## Architecture
Agent Zero Lite maintains the core Agent Zero architecture:
- **Agent Loop:** Reason → Tool Use → Response cycle
- **Tool System:** Extensible plugin architecture
- **Memory:** FAISS vector database for persistent memory
- **Extensions:** Hook-based system for customization
- **Prompts:** Template-based prompt management
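The Reason → Tool Use → Response cycle above can be sketched as a loop. In the real system an LLM makes each decision; this stub substitutes a scripted decision function and a single stub tool to show the control flow only.

```python
# Illustrative agent loop: Reason -> Tool Use -> Response.
# decide() stands in for the LLM; tools is a name -> callable registry.
def run_agent_loop(task, tools, decide, max_steps=5):
    history = []
    for _ in range(max_steps):
        # Reason: choose the next action from the task and prior observations.
        action = decide(task, history)
        if action["type"] == "respond":
            # Response: the agent is done; return its final message.
            return action["message"], history
        # Tool Use: dispatch to the named tool and record the observation.
        observation = tools[action["tool"]](**action.get("args", {}))
        history.append((action["tool"], observation))
    return "step limit reached", history

# Stub tool and a scripted "reasoner" standing in for the model.
tools = {"add": lambda a, b: a + b}

def decide(task, history):
    if not history:
        return {"type": "tool", "tool": "add", "args": {"a": 2, "b": 3}}
    return {"type": "respond", "message": f"result is {history[-1][1]}"}

answer, trace = run_agent_loop("add 2 and 3", tools, decide)
```

The `max_steps` cap mirrors the usual safeguard in agent loops: without it, a model that never emits a final response would iterate forever.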
## Development
### Adding Tools
Create new tools in `python/tools/`:
```python
from python.helpers.tool import Tool, Response

class MyTool(Tool):
    async def execute(self, **kwargs):
        # Tool logic here
        return Response(message="result", break_loop=False)
```
### Adding Extensions
Create extensions in `python/extensions/`:
```python
from python.helpers.extension import Extension

class MyExtension(Extension):
    async def execute(self, **kwargs):
        # Extension logic here
        pass
```
## Troubleshooting
### Common Issues
1. **Model not responding:** Check API keys in `.env`
2. **Port in use:** Change `PORT` in `.env`
3. **Memory issues:** Reduce context length settings
4. **Missing dependencies:** Run `pip install -r requirements.txt`
### Debugging
Enable debug logging by setting:
```bash
LITELLM_LOG=DEBUG
```
## Migration
### From Full Agent Zero
1. Copy `.env` settings
2. Copy `memory/` and `knowledge/` folders
3. Copy `work_dir/` contents
4. Remove Docker/SSH configurations
### To Full Agent Zero
1. Install additional dependencies
2. Add Docker/SSH configurations
3. No data migration needed
## Performance
Agent Zero Lite is optimized for:
- **Startup:** ~3 seconds vs 15+ seconds
- **Memory:** ~200MB vs 1GB+ RAM usage
- **Dependencies:** ~30 packages vs 45+ packages
- **Installation:** <2 minutes vs 10+ minutes
## License
Same as Agent Zero; check the original repository for license terms.
## Support
For issues and questions:
1. Check this README
2. Review `.env.example` configuration
3. See the original Agent Zero documentation
4. Report issues to the Agent Zero repository