| Name | z007 |
| Version | 0.2.1 |
| download | |
| home_page | None |
| Summary | Micro agent with tool support and MCP integration. |
| upload_time | 2025-09-15 18:17:54 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | MIT |
| keywords | agent, ai, async, bedrock, llm, mcp, tools |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# ⚡ z007 🤖: Nimble AI Agent
_pronounced: "zee-double-oh-seven"_

A lightweight, readable agent for interacting with LLMs on AWS Bedrock, with tool and MCP (Model Context Protocol) support.
## Features
- 🟢 **Ultra Readable**: Clean, maintainable codebase in about 600 lines - easy to understand, modify, and extend
- ⚡ **Super easy**: Just run `uvx z007@latest` with `AWS_PROFILE=<your profile>` set in your environment and start chatting instantly
- ⚡ **Simple Install**: Install once with `uv tool install --upgrade z007`, then run `z007` with `AWS_PROFILE=<your profile>` set in your environment
- 🔧 **Tool Support**: Built-in calculator, plus the ability to use plain Python functions as tools
- 🔌 **MCP Integration**: Connect to Model Context Protocol servers
- 🐍 **Python API**: Easy integration into your Python projects
- 🚀 **Async**: Concurrent tool execution
## Quick Start
### Install and run with uvx (recommended)
```bash
# Install and run directly with AWS_PROFILE configured - fastest way to start!
AWS_PROFILE=your-profile uvx z007@latest

# Or install globally
uv tool install z007
AWS_PROFILE=your-profile z007
```

### Install as Python package
```bash
pip install z007
```
## Usage
### Command Line
```bash
# Start interactive chat
z007

# With custom model (AWS Bedrock)
AWS_PROFILE=your-profile z007 --model-id "openai.gpt-oss-120b-1:0"

# With MCP configuration
z007 --mcp-config ./mcp.json
```
### Python API
#### Simple usage
```python
import asyncio
from z007 import Agent, create_calculator_tool

async def main():
    calculator = create_calculator_tool()
    async with Agent(model_id="openai.gpt-oss-20b-1:0", tools=[calculator]) as agent:
        response = await agent.run("What is 2+2?")
        print(response)

asyncio.run(main())
```
#### With a system prompt
```python
import asyncio
from z007 import Agent, create_calculator_tool

async def main():
    calculator = create_calculator_tool()
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        system_prompt="You are a helpful coding assistant.",
        tools=[calculator]
    ) as agent:
        response = await agent.run("Write a Python function to reverse a string")
        print(response)

asyncio.run(main())
```
### Custom Tools
Create your own tools by writing simple Python functions:
```python
import asyncio
from z007 import Agent

def weather_tool(city: str) -> str:
    """Get weather information for a city"""
    # In a real implementation, call a weather API
    return f"The weather in {city} is sunny, 25°C"

def file_reader_tool(filename: str) -> str:
    """Read contents of a file"""
    try:
        with open(filename, 'r') as f:
            return f.read()
    except Exception as e:
        return f"Error reading file: {e}"

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[weather_tool, file_reader_tool]
    ) as agent:
        response = await agent.run("What's the weather like in Paris?")
        print(response)

asyncio.run(main())
```
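The tools above each take a single string parameter; a tool with several typed parameters should work the same way. Below is a minimal sketch under that assumption, using a hypothetical `currency_convert_tool` purely for illustration:

```python
import asyncio
from z007 import Agent

def currency_convert_tool(amount: float, rate: float) -> str:
    """Convert an amount using a fixed exchange rate"""
    # Hypothetical tool for illustration; a real one would look up live rates.
    return f"{amount} converts to {amount * rate:.2f} at rate {rate}"

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[currency_convert_tool],
    ) as agent:
        response = await agent.run("Convert 100 at a rate of 1.1")
        print(response)

asyncio.run(main())
```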
### MCP Integration
Connect to Model Context Protocol servers for advanced capabilities:
1. Create `mcp.json`:
```json
{
  "servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "${env:BRAVE_API_KEY}"
      }
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```
2. Use with z007:
```bash
z007 --mcp-config mcp.json
```
Or in Python:
```python
import asyncio
import json
from z007 import Agent

# Load MCP config
with open("mcp.json") as f:
    mcp_config = json.load(f)

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        mcp_config=mcp_config
    ) as agent:
        response = await agent.run("Search for recent news about AI")
        print(response)

asyncio.run(main())
```
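Since `Agent` accepts both `tools` and `mcp_config`, local tools and MCP servers can presumably be combined in a single agent; that the two parameters may be passed together is an assumption here, as the sections above document them separately. A sketch reusing the built-in calculator:

```python
import asyncio
import json
from z007 import Agent, create_calculator_tool

async def main():
    # Assumes tools and mcp_config can be combined; each is documented separately above.
    with open("mcp.json") as f:
        mcp_config = json.load(f)

    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[create_calculator_tool()],
        mcp_config=mcp_config,
    ) as agent:
        response = await agent.run("List the files in the project and compute 12 * 7")
        print(response)

asyncio.run(main())
```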
## Configuration
### Environment Variables
For AWS Bedrock (the default provider), set either a named profile or explicit credentials (a minimal sketch follows the list):
- `AWS_PROFILE`: AWS profile name (e.g., `AWS_PROFILE=codemobs`)

  **or**

- `AWS_REGION`: AWS region (default: `us-east-1`)
- `AWS_ACCESS_KEY_ID`: AWS access key
- `AWS_SECRET_ACCESS_KEY`: AWS secret key
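For example, a minimal setup sketch in Python, assuming z007 resolves credentials the standard AWS way (picking up a profile or access keys from the environment); the profile name below is a placeholder:

```python
import asyncio
import os

from z007 import Agent, create_calculator_tool

# Option 1: use a named AWS profile (placeholder value).
os.environ["AWS_PROFILE"] = "your-profile"

# Option 2: explicit credentials instead of a profile (placeholder values).
# os.environ["AWS_REGION"] = "us-east-1"
# os.environ["AWS_ACCESS_KEY_ID"] = "..."
# os.environ["AWS_SECRET_ACCESS_KEY"] = "..."

async def main():
    async with Agent(
        model_id="openai.gpt-oss-20b-1:0",
        tools=[create_calculator_tool()],
    ) as agent:
        print(await agent.run("What is 17 * 23?"))

asyncio.run(main())
```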
### Supported Models
AWS Bedrock models with verified access:

- `openai.gpt-oss-20b-1:0` (default)
- Any AWS Bedrock model with tool support (see the sketch below)

Note: Model availability depends on your AWS account's Bedrock access permissions. Use `AWS_PROFILE=your-profile` to specify credentials.
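For instance, a sketch that points the agent at the larger `openai.gpt-oss-120b-1:0` model used in the CLI example above, assuming your account has Bedrock access to it:

```python
import asyncio
from z007 import Agent, create_calculator_tool

async def main():
    # Larger model from the CLI example; availability depends on your Bedrock permissions.
    async with Agent(
        model_id="openai.gpt-oss-120b-1:0",
        tools=[create_calculator_tool()],
    ) as agent:
        print(await agent.run("What is 2+2?"))

asyncio.run(main())
```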
## Interactive Commands
When running `z007` in interactive mode:
- `/help` - Show help
- `/tools` - List available tools
- `/clear` - Clear conversation history
- `/exit` - Exit
## Requirements
- Python 3.9+
- LLM provider credentials (AWS for Bedrock)
## License
MIT License
Raw data
{
"_id": null,
"home_page": null,
"name": "z007",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "agent, ai, async, bedrock, llm, mcp, tools",
"author": null,
"author_email": "Igor Okulist <okigan@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/60/f1/4788cd10047d38b984497984586fd3af3dcb4a544f01f552bb275a05a56c/z007-0.2.1.tar.gz",
"platform": null,
"description": "# \u26a1 z007 \ud83e\udd16: Nimble AI Agent\n_pronounced: \"zee-double-oh-seven\"_ \n\nA lightweight and readable agent for interacting with LLM on AWS Bedrock with tool and MCP (Model Context Protocol) support.\n\n## Features\n\n- \ud83d\udfe2 **Ultra Readable**: Clean, maintainable codebase in about 600 lines - easy to understand, modify, and extend\n- \u26a1 **Super easy**: Just run `uvx z007@latest` with `AWS_PROFILE=<your profile>` in env and start chatting instantly \n- \u26a1 **Simple Install**: Quick install `uv tool install --upgrade z007` and start chatting instantly `z007` with `AWS_PROFILE=<your profile>` in env\n- \ud83d\udd27 **Tool Support**: Built-in calculator and easily use plain python functions as tools\n- \ud83d\udd0c **MCP Integration**: Connect to Model Context Protocol servers\n- \ud83d\udc0d **Python API**: Easy integration into your Python projects\n- \ud83d\ude80 **Async**: Concurrent tool execution\n\n## Quick Start\n\n### Install and run with uvx (recommended)\n\n```bash\n```bash\n# Install and run directly with AWS_PROFILE configured - fastest way to start!\nAWS_PROFILE=your-profile uvx z007@latest\n\n# Or install globally\nuv tool install z007\nAWS_PROFILE=your-profile z007\n```\n\n\n\n\n\n### Install as Python package\n\n```bash\npip install z007\n```\n\n## Usage\n\n### Command Line\n\n```bash\n# Start interactive chat\nz007\n\n# With custom model (AWS Bedrock)\nAWS_PROFILE=your-profile z007 --model-id \"openai.gpt-oss-120b-1:0\"\n\n# With MCP configuration\nz007 --mcp-config ./mcp.json\n```\n\n### Python API\n\n#### Simple usage\n\n```python\nimport asyncio\nfrom z007 import Agent, create_calculator_tool\n\nasync def main():\n calculator = create_calculator_tool()\n async with Agent(model_id=\"openai.gpt-oss-20b-1:0\", tools=[calculator]) as agent:\n response = await agent.run(\"What is 2+2?\")\n print(response)\n\nasyncio.run(main())\n```\n\n#### Using the Agent class\n\n```python\nimport asyncio\nfrom z007 import Agent, create_calculator_tool\n\nasync def main():\n calculator = create_calculator_tool()\n async with Agent(\n model_id=\"openai.gpt-oss-20b-1:0\",\n system_prompt=\"You are a helpful coding assistant.\",\n tools=[calculator]\n ) as agent:\n response = await agent.run(\"Write a Python function to reverse a string\")\n print(response)\n\nasyncio.run(main())\n```\n\n### Custom Tools\n\nCreate your own tools by writing simple Python functions:\n\n```python\nimport asyncio\nfrom z007 import Agent\n\ndef weather_tool(city: str) -> str:\n \"\"\"Get weather information for a city\"\"\"\n # In a real implementation, call a weather API\n return f\"The weather in {city} is sunny, 25\u00b0C\"\n\ndef file_reader_tool(filename: str) -> str:\n \"\"\"Read contents of a file\"\"\"\n try:\n with open(filename, 'r') as f:\n return f.read()\n except Exception as e:\n return f\"Error reading file: {e}\"\n\nasync def main():\n async with Agent(\n model_id=\"openai.gpt-oss-20b-1:0\",\n tools=[weather_tool, file_reader_tool]\n ) as agent:\n response = await agent.run(\"What's the weather like in Paris?\")\n print(response)\n\nasyncio.run(main())\n```\n\n### MCP Integration\n\nConnect to Model Context Protocol servers for advanced capabilities:\n\n1. 
Create `mcp.json`:\n\n```json\n{\n \"servers\": {\n \"filesystem\": {\n \"command\": \"npx\",\n \"args\": [\"-y\", \"@modelcontextprotocol/server-filesystem\", \"/path/to/project\"]\n },\n \"brave-search\": {\n \"command\": \"npx\",\n \"args\": [\"-y\", \"@modelcontextprotocol/server-brave-search\"],\n \"env\": {\n \"BRAVE_API_KEY\": \"${env:BRAVE_API_KEY}\"\n }\n },\n \"playwright\": {\n \"command\": \"npx\",\n \"args\": [\"@playwright/mcp@latest\"]\n }\n }\n}\n```\n\n2. Use with z007:\n\n```bash\nz007 --mcp-config mcp.json\n```\n\nOr in Python:\n\n```python\nimport json\nfrom z007 import Agent\n\n# Load MCP config\nwith open(\"mcp.json\") as f:\n mcp_config = json.load(f)\n\nasync with Agent(\n model_id=\"openai.gpt-oss-20b-1:0\",\n mcp_config=mcp_config\n) as agent:\n response = await agent.run(\"Search for recent news about AI\")\n print(response)\n```\n\n## Configuration\n\n### Environment Variables\n\nFor AWS Bedrock (default provider):\n- `AWS_PROFILE`: AWS profile name (e.g., `AWS_PROFILE=codemobs`)\n\n **or**\n\n- `AWS_REGION`: AWS region (default: us-east-1)\n- `AWS_ACCESS_KEY_ID`: AWS access key\n- `AWS_SECRET_ACCESS_KEY`: AWS secret key\n\n### Supported Models\n\nAWS Bedrock models with verified access:\n- `openai.gpt-oss-20b-1:0` (default)\n\nNote: Model availability depends on your AWS account's Bedrock access permissions. Use `AWS_PROFILE=your-profile` to specify credentials.\n- Any AWS Bedrock model with tool support\n\n## Interactive Commands\n\nWhen running `z007` in interactive mode:\n\n- `/help` - Show help\n- `/tools` - List available tools \n- `/clear` - Clear conversation history\n- `/exit` - Exit\n\n## Requirements\n\n- Python 3.9+\n- LLM provider credentials (AWS for Bedrock)\n\n## License\n\nMIT License\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Micro agent with tool support and MCP integration.",
"version": "0.2.1",
"project_urls": {
"Changelog": "https://github.com/okigan/z007/releases",
"Homepage": "https://github.com/okigan/z007",
"Issues": "https://github.com/okigan/z007/issues",
"Repository": "https://github.com/okigan/z007"
},
"split_keywords": [
"agent",
" ai",
" async",
" bedrock",
" llm",
" mcp",
" tools"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "13bef1d44e1597a046e71c5fde1fe654858676bc0fd5501abcf94c7ac5af7827",
"md5": "7d9c14fa822d44825e68acaa1b7e73f6",
"sha256": "e394793b1b0fec4bb94d2ba8b07eee2b0a9e5ce2e4eb523da5ca44d5d8cfc55e"
},
"downloads": -1,
"filename": "z007-0.2.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "7d9c14fa822d44825e68acaa1b7e73f6",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 15807,
"upload_time": "2025-09-15T18:17:53",
"upload_time_iso_8601": "2025-09-15T18:17:53.823757Z",
"url": "https://files.pythonhosted.org/packages/13/be/f1d44e1597a046e71c5fde1fe654858676bc0fd5501abcf94c7ac5af7827/z007-0.2.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "60f14788cd10047d38b984497984586fd3af3dcb4a544f01f552bb275a05a56c",
"md5": "2d70884140fb9476f9a50d6ba955ec35",
"sha256": "d89562c747ed071eb7ae0c51c9201b165e9557d4ac5b93fdfa565ee755938a0c"
},
"downloads": -1,
"filename": "z007-0.2.1.tar.gz",
"has_sig": false,
"md5_digest": "2d70884140fb9476f9a50d6ba955ec35",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 379418,
"upload_time": "2025-09-15T18:17:54",
"upload_time_iso_8601": "2025-09-15T18:17:54.846542Z",
"url": "https://files.pythonhosted.org/packages/60/f1/4788cd10047d38b984497984586fd3af3dcb4a544f01f552bb275a05a56c/z007-0.2.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-15 18:17:54",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "okigan",
"github_project": "z007",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "z007"
}