| Field | Value |
|---|---|
| Name | z007 |
| Version | 0.1.10 |
| Summary | Micro agent with tool support and MCP integration. |
| Author | Igor Okulist <okigan@gmail.com> |
| Homepage | https://github.com/okigan/z007 |
| Upload time | 2025-09-10 18:46:57 |
| Requires Python | >=3.9 |
| License | MIT |
| Keywords | agent, ai, async, bedrock, llm, mcp, tools |
# ⚡ z007 🤖
*Pronounced as "ze double O 7"*
A lightweight, readable agent for interacting with LLMs on AWS Bedrock, with tool and MCP (Model Context Protocol) support.
## Features
- 🟢 **Ultra Readable**: Clean, maintainable codebase in ~500 lines - easy to understand, modify, and extend
- ⚡ **Super Easy**: Just run `uvx z007` with `AWS_PROFILE=<your profile>` set in your environment and start chatting instantly
- ⚡ **Simple Install**: Install once with `uv tool install --upgrade z007`, then run `z007` (with `AWS_PROFILE=<your profile>` set) whenever you want to chat
- 🔧 **Tool Support**: Built-in calculator, and any plain Python function can be used as a tool
- 🔌 **MCP Integration**: Connect to Model Context Protocol servers
- 🐍 **Python API**: Easy integration into your Python projects
- 🚀 **Async**: Concurrent tool execution
## Quick Start
### Install and run with uvx (recommended)
```bash
# Install and run directly with AWS_PROFILE configured - fastest way to start!
uvx z007
# Or install globally with AWS_PROFILE configured
uv tool install z007
z007
```
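Both entry points read AWS credentials from the environment, so the profile can also be set inline. For example (`my-profile` is a placeholder for your own AWS profile name):

```bash
# Run once without installing, with the profile set just for this invocation
AWS_PROFILE=my-profile uvx z007
```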

### Install as Python package
```bash
pip install z007
```
## Usage
### Command Line
```bash
# Start interactive chat
z007
# With custom model (AWS Bedrock)
z007 --model-id "anthropic.claude-3-sonnet-20240229-v1:0"
# With MCP configuration
z007 --mcp-config ./mcp.json
```
### Python API
#### Simple usage
```python
import asyncio
from z007 import Agent
async def main():
    async with Agent(model_id="openai.gpt-oss-20b-1:0") as agent:
        response = await agent.run("What is 2+2?")
        print(response)

asyncio.run(main())
```
#### With a system prompt
```python
import asyncio
from z007 import Agent
async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        system_prompt="You are a helpful coding assistant.",
    ) as agent:
        response = await agent.run("Write a Python function to reverse a string")
        print(response)

asyncio.run(main())
```
### Custom Tools
Create your own tools by writing simple Python functions:
```python
import asyncio
from z007 import Agent
def weather_tool(city: str) -> str:
    """Get weather information for a city"""
    # In a real implementation, call a weather API
    return f"The weather in {city} is sunny, 25°C"

def file_reader_tool(filename: str) -> str:
    """Read contents of a file"""
    try:
        with open(filename, 'r') as f:
            return f.read()
    except Exception as e:
        return f"Error reading file: {e}"

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[weather_tool, file_reader_tool],
    ) as agent:
        response = await agent.run("What's the weather like in Paris?")
        print(response)

asyncio.run(main())
```
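Plain functions work as tools because a function already carries the metadata a tool schema needs: its name, type hints, and docstring. The sketch below shows how such a schema *could* be derived with only the standard library; it illustrates the idea, not z007's actual implementation, and `tool_schema` is a hypothetical helper name.

```python
import inspect
from typing import get_type_hints

def tool_schema(fn):
    """Derive a minimal JSON-schema-style tool description from a plain function."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # only parameters go into the schema
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    params = {name: {"type": type_map.get(hint, "string")} for name, hint in hints.items()}
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": params,
            "required": list(params),
        },
    }

def weather_tool(city: str) -> str:
    """Get weather information for a city"""
    return f"The weather in {city} is sunny"

print(tool_schema(weather_tool)["parameters"]["properties"])
# → {'city': {'type': 'string'}}
```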
### MCP Integration
Connect to Model Context Protocol servers for advanced capabilities:
1. Create `.vscode/mcp.json`:
```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "${env:BRAVE_API_KEY}"
      }
    }
  }
}
```
2. Use with z007:
```bash
z007 --mcp-config .vscode/mcp.json
```
Or in Python:
```python
import asyncio
from z007 import Agent

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        mcp_config_path=".vscode/mcp.json",
    ) as agent:
        response = await agent.run("Search for recent news about AI")
        print(response)

asyncio.run(main())
```
## Configuration
### Environment Variables
For AWS Bedrock (the default provider), set either:

- `AWS_PROFILE`: AWS profile name

or explicit credentials:

- `AWS_REGION`: AWS region (default: `us-east-1`)
- `AWS_ACCESS_KEY_ID`: AWS access key
- `AWS_SECRET_ACCESS_KEY`: AWS secret key
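For example, exporting explicit credentials before starting a session (all values below are placeholders):

```bash
export AWS_REGION=us-east-1
export AWS_ACCESS_KEY_ID=...      # your access key ID
export AWS_SECRET_ACCESS_KEY=...  # your secret access key
z007
```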
### Supported Models
Supported AWS Bedrock models:
- `openai.gpt-oss-20b-1:0` (default)
- Any AWS Bedrock model with tool support
## Interactive Commands
When running `z007` in interactive mode:
- `/help` - Show help
- `/tools` - List available tools
- `/clear` - Clear conversation history
- `/exit` - Exit
## Requirements
- Python 3.9+
- LLM provider credentials (AWS for Bedrock)
## License
MIT License