# Agent Zero Lite
A lightweight, cross-platform implementation of Agent Zero that maintains core functionality while reducing complexity and dependencies.
## Features
✅ **Full LiteLLM Support** - 100+ AI providers (OpenAI, Anthropic, Google, local models, etc.)
✅ **Web UI** - Complete interface at localhost:50001
✅ **Vector Memory** - FAISS-based persistent memory
✅ **Document RAG** - PDF, text, and document processing
✅ **Multi-Agent** - Superior/subordinate agent hierarchy
✅ **MCP Client** - Model Context Protocol integration
✅ **Local Execution** - Python, Node.js, and terminal
✅ **Tunneling** - Remote access support
✅ **File Management** - Work directory browser
## Removed from Full Version
❌ Browser automation (Playwright)
❌ Docker/SSH execution
❌ Speech processing (STT/TTS)
❌ Task scheduling
❌ Backup/restore system
❌ Web search tools
## Quick Start
1. Install (CPU-only by default):
```bash
pip install agent-zero-lite
```
- Optional extras:
- CPU ML helpers (additional ONNX/Transformers utilities):
```bash
pip install "agent-zero-lite[cpu]"
```
- Transformers stack (CPU) and ONNX runtime (sentence-transformers included by default):
```bash
pip install "agent-zero-lite[ml]"
```
- Audio transcription (Whisper, CPU):
```bash
pip install "agent-zero-lite[audio]"
```
- GPU stack (advanced; choose your CUDA build of torch separately if needed):
```bash
pip install "agent-zero-lite[gpu]"
# For PyTorch CUDA builds, see: https://pytorch.org/get-started/locally/
```
2. Configure environment:
```bash
cp .env.example .env
# Edit .env with your API keys
```
3. Start the Web UI:
```bash
python run_ui.py
```
4. **Open browser:**
```
http://localhost:50001
```
## Configuration
### Minimal Setup
Set at least one LLM provider in `.env`:
```bash
# OpenAI
OPENAI_API_KEY=sk-...

# Or Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Or local Ollama
CHAT_MODEL_PROVIDER=ollama
CHAT_MODEL_NAME=llama3.1:8b
OLLAMA_API_BASE=http://localhost:11434
```
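Before starting the UI, it can help to confirm that at least one provider variable is actually set. The helper below is a sketch, not part of Agent Zero Lite's API; the variable names match the `.env` examples above.

```python
import os

# Provider-related variables from the minimal setup above. "CHAT_MODEL_PROVIDER"
# covers local backends such as Ollama that need no API key.
PROVIDER_VARS = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "CHAT_MODEL_PROVIDER",
]

def configured_providers(env: dict) -> list:
    """Return the provider variables that are set and non-empty."""
    return [name for name in PROVIDER_VARS if env.get(name)]

# Example: configured_providers(dict(os.environ)) lists what is active;
# an empty result means the .env file still needs editing.
```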
### Full Configuration
See `.env.example` for all available options including:
- All 100+ LiteLLM providers
- Model configurations
- Rate limiting settings
- MCP server integration
- Memory and knowledge settings
## Supported Models
Agent Zero Lite supports **all LiteLLM providers**:
### Commercial APIs
- **OpenAI:** GPT-4o, GPT-4, GPT-3.5, etc.
- **Anthropic:** Claude 3.5 Sonnet, Claude 3 Opus, etc.
- **Google:** Gemini 1.5 Pro, Gemini 1.5 Flash, etc.
- **Groq:** Llama 3.1, Mixtral, etc. (fast inference)
- **Together AI:** Llama, Mistral, etc.
- **Mistral AI:** Mistral Large, Mistral 7B, etc.
- **Cohere:** Command R+, Command Light, etc.
### Local Models
- **Ollama:** Any local model (llama3.1, mistral, etc.)
- **LM Studio:** Local model server
- **Text Generation WebUI:** Local inference
- **vLLM:** High-performance inference server
### Enterprise
- **Azure OpenAI:** Enterprise GPT models
- **AWS Bedrock:** Claude, Titan, etc.
- **Google Vertex AI:** Enterprise Gemini
- **Hugging Face:** Hosted models
## Usage Examples
### Basic Chat
```python
from agent import AgentContext
import initialize

# Initialize agent
config = initialize.initialize_agent()
context = AgentContext(config)

# Send message
response = context.communicate("Hello, what can you help me with?")
```
### Code Execution
The agent can execute Python, Node.js, and terminal commands:
```
User: "Create a Python script that calculates fibonacci numbers"
Agent: Uses code_execution tool to write and run Python code
```
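Conceptually, each runtime boils down to handing the snippet to the right interpreter. The sketch below shows the one-shot idea with `subprocess`; the actual `code_execution` tool additionally manages persistent sessions, so treat the names here as illustrative.

```python
import subprocess
import sys

# Map each supported runtime to a command that accepts inline code.
# "node" and "bash" must be on PATH for those runtimes to work.
RUNTIMES = {
    "python": [sys.executable, "-c"],
    "node": ["node", "-e"],
    "terminal": ["bash", "-c"],
}

def run_snippet(runtime: str, code: str, timeout: int = 30) -> str:
    """Run a code snippet in the given runtime and return its stdout."""
    result = subprocess.run(
        RUNTIMES[runtime] + [code],
        capture_output=True, text=True, timeout=timeout,
    )
    return result.stdout.strip()
```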
### Document Processing
```
User: "Analyze this PDF document and summarize the key points"
Agent: Uses document_query tool to process and analyze documents
```
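Document RAG pipelines typically split extracted text into overlapping chunks before embedding them into the vector store. The chunker below is a generic sketch of that step, not the `document_query` tool's actual implementation.

```python
def chunk_text(text: str, size: int = 500, overlap: int = 100) -> list:
    """Split extracted document text into overlapping chunks for embedding.

    Overlap preserves context that would otherwise be cut at chunk borders.
    """
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks
```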
### Multi-Agent Collaboration
```
User: "Create a complex analysis using multiple specialized agents"
Agent: Uses call_subordinate to delegate tasks to specialized sub-agents
```
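The superior/subordinate pattern can be sketched as follows. Class and method names here are illustrative (only `call_subordinate` is the real tool name); a real agent would run its full reason → tool use → response loop inside `handle`.

```python
import asyncio

class LiteAgent:
    """Toy model of the superior/subordinate hierarchy, for illustration."""

    def __init__(self, role: str, superior=None):
        self.role = role
        self.superior = superior

    async def call_subordinate(self, role: str, task: str) -> str:
        # The superior spawns a specialised sub-agent and awaits its result.
        sub = LiteAgent(role, superior=self)
        return await sub.handle(task)

    async def handle(self, task: str) -> str:
        # Stand-in for the agent's reasoning loop.
        return f"[{self.role}] completed: {task}"
```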
## Architecture
Agent Zero Lite maintains the core Agent Zero architecture:
- **Agent Loop:** Reason → Tool Use → Response cycle
- **Tool System:** Extensible plugin architecture
- **Memory:** FAISS vector database for persistent memory
- **Extensions:** Hook-based system for customization
- **Prompts:** Template-based prompt management
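The memory component's core operation is nearest-neighbour search over embeddings. The brute-force NumPy version below shows the same idea FAISS's `IndexFlatL2` implements (FAISS just does it much faster at scale); the class and the toy vectors are illustrative, not Agent Zero Lite's internals.

```python
import numpy as np

class TinyVectorStore:
    """Brute-force stand-in for a FAISS flat (L2) index."""

    def __init__(self, dim: int):
        self.vectors = np.empty((0, dim), dtype=np.float32)
        self.texts = []

    def add(self, text: str, embedding: np.ndarray) -> None:
        # Stack the new embedding as a row alongside its source text.
        self.vectors = np.vstack([self.vectors, embedding.astype(np.float32)])
        self.texts.append(text)

    def search(self, query: np.ndarray, k: int = 3) -> list:
        # Euclidean distance to every stored vector, like faiss.IndexFlatL2.
        dists = np.linalg.norm(self.vectors - query, axis=1)
        return [self.texts[i] for i in np.argsort(dists)[:k]]
```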
## Development
### Adding Tools
Create new tools in `python/tools/`:
```python
from python.helpers.tool import Tool, Response

class MyTool(Tool):
    async def execute(self, **kwargs):
        # Tool logic here
        return Response(message="result", break_loop=False)
```
### Adding Extensions
Create extensions in `python/extensions/`:
```python
from python.helpers.extension import Extension

class MyExtension(Extension):
    async def execute(self, **kwargs):
        # Extension logic here
        pass
```
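A hook-based extension system boils down to a registry of extensions keyed by hook name, fired at fixed points in the agent loop. The sketch below is a minimal illustration under assumed names (the registry functions and the hook string are hypothetical, not the actual internals).

```python
import asyncio

# Hypothetical registry: hook name -> list of extension instances.
_registry = {}

def register(hook: str, extension) -> None:
    """Attach an extension instance to a named hook point."""
    _registry.setdefault(hook, []).append(extension)

async def fire(hook: str, **kwargs) -> None:
    """Run every extension registered for this hook point, in order."""
    for ext in _registry.get(hook, []):
        await ext.execute(**kwargs)
```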
## Troubleshooting
### Common Issues
1. **Model not responding:** Check API keys in `.env`
2. **Port in use:** Change `PORT` in `.env`
3. **Memory issues:** Reduce context length settings
4. **Missing dependencies:** Run `pip install -r requirements.txt`
### Debugging
Enable debug logging by setting:
```bash
LITELLM_LOG=DEBUG
```
## Migration
### From Full Agent Zero
1. Copy `.env` settings
2. Copy `memory/` and `knowledge/` folders
3. Copy `work_dir/` contents
4. Remove Docker/SSH configurations
### To Full Agent Zero
1. Install additional dependencies
2. Add Docker/SSH configurations
3. No data migration needed
## Performance
Compared with full Agent Zero, the Lite version is significantly leaner:
- **Startup:** ~3 seconds vs. 15+ seconds
- **Memory:** ~200 MB vs. 1 GB+ RAM usage
- **Dependencies:** ~30 packages vs. 45+ packages
- **Installation:** <2 minutes vs. 10+ minutes
## License
Same license as Agent Zero; check the original repository for the license terms.
## Support
For issues and questions:
1. Check this README
2. Review `.env.example` configuration
3. See the original Agent Zero documentation
4. Report issues to the Agent Zero repository