| Name | hanzo |
| Version | 0.3.31 |
| home_page | None |
| Summary | Hanzo AI - Complete AI Infrastructure Platform with CLI, Router, MCP, and Agent Runtime |
| upload_time | 2025-09-07 20:42:39 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.10 |
| license | None |
| keywords | agents, ai, cli, hanzo, llm, local-ai, mcp, private-ai |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Hanzo CLI and Orchestration Tools
Core CLI and orchestration tools for the Hanzo AI platform.
## Installation
```bash
pip install hanzo
```
## Features
- **Interactive Chat**: Chat with AI models from the command line
- **Node Management**: Run local AI inference nodes
- **Router Control**: Manage the LLM proxy router
- **REPL Interface**: Interactive Python REPL with AI assistance
- **Batch Orchestration**: Run batches of AI tasks
- **Memory Management**: Persistent conversation memory
## Usage
### CLI Commands
```bash
# Interactive chat
hanzo chat

# Use specific model
hanzo chat --model gpt-4

# Use router (local proxy)
hanzo chat --router

# Use cloud API
hanzo chat --cloud
```
### Node Management
```bash
# Start local node
hanzo node start

# Check status
hanzo node status

# List available models
hanzo node models

# Load specific model
hanzo node load llama2:7b

# Stop node
hanzo node stop
```
### Router Management
```bash
# Start router proxy
hanzo router start

# Check router status
hanzo router status

# List available models
hanzo router models

# View configuration
hanzo router config

# Stop router
hanzo router stop
```
### Interactive REPL
```bash
# Start REPL
hanzo repl

# In REPL:
> /help            # Show help
> /models          # List models
> /model gpt-4     # Switch model
> /clear           # Clear context
> What is Python?  # Ask questions
```
## Python API
### Batch Orchestration
```python
from hanzo.batch_orchestrator import BatchOrchestrator

orchestrator = BatchOrchestrator()
results = await orchestrator.run_batch([
    "Summarize quantum computing",
    "Explain machine learning",
    "Define artificial intelligence"
```
### Memory Management
```python
from hanzo.memory_manager import MemoryManager

memory = MemoryManager()
memory.add_to_context("user", "What is Python?")
memory.add_to_context("assistant", "Python is...")
context = memory.get_context()
```
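A hedged sketch of feeding the accumulated context into the next turn, assuming `get_context()` returns chat-style messages with `role` and `content` fields (the actual return type is not documented here):

```python
from hanzo.memory_manager import MemoryManager

memory = MemoryManager()
memory.add_to_context("user", "What is Python?")
memory.add_to_context("assistant", "Python is a general-purpose language.")
memory.add_to_context("user", "What is it commonly used for?")

# Assumption: each message looks like {"role": ..., "content": ...};
# adapt the field access if the real structure differs.
for message in memory.get_context():
    print(f"{message['role']}: {message['content']}")
```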
### Fallback Handling
```python
from hanzo.fallback_handler import FallbackHandler

handler = FallbackHandler()
result = await handler.handle_with_fallback(
    primary_fn=api_call,
    fallback_fn=local_inference
)
```
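`api_call` and `local_inference` above are placeholders. The sketch below makes the example self-contained with two stand-in async callables, assuming `handle_with_fallback` awaits the primary callable and switches to the fallback when it raises:

```python
import asyncio

from hanzo.fallback_handler import FallbackHandler


async def api_call() -> str:
    # Stand-in for a cloud API request; raises to simulate an outage.
    raise ConnectionError("cloud endpoint unreachable")


async def local_inference() -> str:
    # Stand-in for inference against a local node.
    return "answer produced locally"


async def main() -> None:
    handler = FallbackHandler()
    result = await handler.handle_with_fallback(
        primary_fn=api_call,
        fallback_fn=local_inference,
    )
    print(result)  # expected to come from the fallback path


asyncio.run(main())
```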
## Configuration
### Environment Variables
```bash
# API settings
HANZO_API_KEY=your-api-key
HANZO_BASE_URL=https://api.hanzo.ai

# Router settings
HANZO_ROUTER_URL=http://localhost:4000/v1

# Node settings
HANZO_NODE_URL=http://localhost:8000/v1
HANZO_NODE_WORKERS=4

# Model preferences
HANZO_DEFAULT_MODEL=gpt-4
HANZO_FALLBACK_MODEL=llama2:7b
```
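The same variables can be read from your own scripts; a small sketch using only the standard library, with defaults mirroring the URLs above (the defaults are illustrative, not necessarily what the CLI itself applies):

```python
import os

# Fall back to the documented local endpoints when a variable is unset.
api_key = os.environ.get("HANZO_API_KEY")
base_url = os.environ.get("HANZO_BASE_URL", "https://api.hanzo.ai")
router_url = os.environ.get("HANZO_ROUTER_URL", "http://localhost:4000/v1")
node_url = os.environ.get("HANZO_NODE_URL", "http://localhost:8000/v1")
default_model = os.environ.get("HANZO_DEFAULT_MODEL", "gpt-4")

if api_key is None:
    print("HANZO_API_KEY is not set; cloud requests may fail.")
```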
### Configuration File
Create `~/.hanzo/config.yaml`:
```yaml
api:
  key: your-api-key
  base_url: https://api.hanzo.ai

router:
  url: http://localhost:4000/v1
  auto_start: true

node:
  url: http://localhost:8000/v1
  workers: 4
  models:
    - llama2:7b
    - mistral:7b

models:
  default: gpt-4
  fallback: llama2:7b
```
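The CLI loads this file itself; purely for illustration, here is how the structure parses with PyYAML (assumes `pyyaml` is installed, which is not a requirement of this package):

```python
from pathlib import Path

import yaml  # pip install pyyaml

config = yaml.safe_load((Path.home() / ".hanzo" / "config.yaml").read_text())

print(config["models"]["default"])  # gpt-4
print(config["node"]["models"])     # ['llama2:7b', 'mistral:7b']
```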
## Architecture
### Components
- **CLI**: Command-line interface (`cli.py`)
- **Chat**: Interactive chat interface (`commands/chat.py`)
- **Node**: Local AI node management (`commands/node.py`)
- **Router**: LLM proxy management (`commands/router.py`)
- **REPL**: Interactive Python REPL (`interactive/repl.py`)
- **Orchestrator**: Batch task orchestration (`batch_orchestrator.py`)
- **Memory**: Conversation memory (`memory_manager.py`)
- **Fallback**: Resilient API handling (`fallback_handler.py`)
### Port Allocation
- **4000**: Router (LLM proxy)
- **8000**: Node (local AI)
- **9550-9553**: Desktop app integration
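
A quick way to check which of these services are running is to probe the ports directly; a minimal sketch using only the standard library, assuming the services listen on localhost:

```python
import socket

PORTS = {4000: "router (LLM proxy)", 8000: "node (local AI)"}

for port, role in PORTS.items():
    # connect_ex returns 0 when something is listening on the port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        is_open = sock.connect_ex(("127.0.0.1", port)) == 0
    print(f"{port} ({role}): {'listening' if is_open else 'not running'}")
```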
## Development
### Setup
```bash
cd pkg/hanzo
uv sync --all-extras
```
### Testing
```bash
# Run tests
pytest tests/

# With coverage
pytest tests/ --cov=hanzo
```
### Building
```bash
uv build
```
## License
Apache License 2.0
Raw data
{
"_id": null,
"home_page": null,
"name": "hanzo",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "agents, ai, cli, hanzo, llm, local-ai, mcp, private-ai",
"author": null,
"author_email": "Hanzo AI <dev@hanzo.ai>",
"download_url": "https://files.pythonhosted.org/packages/94/3f/eab1db7ebf87f99b80220157a68cb5db60be1abf72ec59f01b1aeb12ffed/hanzo-0.3.31.tar.gz",
"platform": null,
"description": "# Hanzo CLI and Orchestration Tools\n\n[](https://pypi.org/project/hanzo/)\n[](https://pypi.org/project/hanzo/)\n\nCore CLI and orchestration tools for the Hanzo AI platform.\n\n## Installation\n\n```bash\npip install hanzo\n```\n\n## Features\n\n- **Interactive Chat**: Chat with AI models through CLI\n- **Node Management**: Run local AI inference nodes\n- **Router Control**: Manage LLM proxy router\n- **REPL Interface**: Interactive Python REPL with AI\n- **Batch Orchestration**: Orchestrate multiple AI tasks\n- **Memory Management**: Persistent conversation memory\n\n## Usage\n\n### CLI Commands\n\n```bash\n# Interactive chat\nhanzo chat\n\n# Use specific model\nhanzo chat --model gpt-4\n\n# Use router (local proxy)\nhanzo chat --router\n\n# Use cloud API\nhanzo chat --cloud\n```\n\n### Node Management\n\n```bash\n# Start local node\nhanzo node start\n\n# Check status\nhanzo node status\n\n# List available models\nhanzo node models\n\n# Load specific model\nhanzo node load llama2:7b\n\n# Stop node\nhanzo node stop\n```\n\n### Router Management\n\n```bash\n# Start router proxy\nhanzo router start\n\n# Check router status\nhanzo router status\n\n# List available models\nhanzo router models\n\n# View configuration\nhanzo router config\n\n# Stop router\nhanzo router stop\n```\n\n### Interactive REPL\n\n```bash\n# Start REPL\nhanzo repl\n\n# In REPL:\n> /help # Show help\n> /models # List models\n> /model gpt-4 # Switch model\n> /clear # Clear context\n> What is Python? # Ask questions\n```\n\n## Python API\n\n### Batch Orchestration\n\n```python\nfrom hanzo.batch_orchestrator import BatchOrchestrator\n\norchestrator = BatchOrchestrator()\nresults = await orchestrator.run_batch([\n \"Summarize quantum computing\",\n \"Explain machine learning\",\n \"Define artificial intelligence\"\n])\n```\n\n### Memory Management\n\n```python\nfrom hanzo.memory_manager import MemoryManager\n\nmemory = MemoryManager()\nmemory.add_to_context(\"user\", \"What is Python?\")\nmemory.add_to_context(\"assistant\", \"Python is...\")\ncontext = memory.get_context()\n```\n\n### Fallback Handling\n\n```python\nfrom hanzo.fallback_handler import FallbackHandler\n\nhandler = FallbackHandler()\nresult = await handler.handle_with_fallback(\n primary_fn=api_call,\n fallback_fn=local_inference\n)\n```\n\n## Configuration\n\n### Environment Variables\n\n```bash\n# API settings\nHANZO_API_KEY=your-api-key\nHANZO_BASE_URL=https://api.hanzo.ai\n\n# Router settings\nHANZO_ROUTER_URL=http://localhost:4000/v1\n\n# Node settings\nHANZO_NODE_URL=http://localhost:8000/v1\nHANZO_NODE_WORKERS=4\n\n# Model preferences\nHANZO_DEFAULT_MODEL=gpt-4\nHANZO_FALLBACK_MODEL=llama2:7b\n```\n\n### Configuration File\n\nCreate `~/.hanzo/config.yaml`:\n\n```yaml\napi:\n key: your-api-key\n base_url: https://api.hanzo.ai\n\nrouter:\n url: http://localhost:4000/v1\n auto_start: true\n\nnode:\n url: http://localhost:8000/v1\n workers: 4\n models:\n - llama2:7b\n - mistral:7b\n\nmodels:\n default: gpt-4\n fallback: llama2:7b\n```\n\n## Architecture\n\n### Components\n\n- **CLI**: Command-line interface (`cli.py`)\n- **Chat**: Interactive chat interface (`commands/chat.py`)\n- **Node**: Local AI node management (`commands/node.py`)\n- **Router**: LLM proxy management (`commands/router.py`)\n- **REPL**: Interactive Python REPL (`interactive/repl.py`)\n- **Orchestrator**: Batch task orchestration (`batch_orchestrator.py`)\n- **Memory**: Conversation memory (`memory_manager.py`)\n- **Fallback**: Resilient API handling 
(`fallback_handler.py`)\n\n### Port Allocation\n\n- **4000**: Router (LLM proxy)\n- **8000**: Node (local AI)\n- **9550-9553**: Desktop app integration\n\n## Development\n\n### Setup\n\n```bash\ncd pkg/hanzo\nuv sync --all-extras\n```\n\n### Testing\n\n```bash\n# Run tests\npytest tests/\n\n# With coverage\npytest tests/ --cov=hanzo\n```\n\n### Building\n\n```bash\nuv build\n```\n\n## License\n\nApache License 2.0",
"bugtrack_url": null,
"license": null,
"summary": "Hanzo AI - Complete AI Infrastructure Platform with CLI, Router, MCP, and Agent Runtime",
"version": "0.3.31",
"project_urls": {
"Bug Tracker": "https://github.com/hanzoai/python-sdk/issues",
"Documentation": "https://docs.hanzo.ai/cli",
"Homepage": "https://hanzo.ai",
"Repository": "https://github.com/hanzoai/python-sdk"
},
"split_keywords": [
"agents",
" ai",
" cli",
" hanzo",
" llm",
" local-ai",
" mcp",
" private-ai"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "8a5e0986fcee216a85af0e5f19eeb5071870913b9b7ba1c9a85cb71f5dd77260",
"md5": "ec4480865b279f2ccfcf47b4b51be6fc",
"sha256": "589f96d88de490517565b7ca733acaa615fa82576bc8a5f6dd007ce82195d9c1"
},
"downloads": -1,
"filename": "hanzo-0.3.31-py3-none-any.whl",
"has_sig": false,
"md5_digest": "ec4480865b279f2ccfcf47b4b51be6fc",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 124854,
"upload_time": "2025-09-07T20:42:38",
"upload_time_iso_8601": "2025-09-07T20:42:38.231063Z",
"url": "https://files.pythonhosted.org/packages/8a/5e/0986fcee216a85af0e5f19eeb5071870913b9b7ba1c9a85cb71f5dd77260/hanzo-0.3.31-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "943feab1db7ebf87f99b80220157a68cb5db60be1abf72ec59f01b1aeb12ffed",
"md5": "8a0e758fc89b1cd867498c2ad3476cc9",
"sha256": "65eba8613a114de23e7c82de421c2d32b75271d1a65916831928a37736f4eef1"
},
"downloads": -1,
"filename": "hanzo-0.3.31.tar.gz",
"has_sig": false,
"md5_digest": "8a0e758fc89b1cd867498c2ad3476cc9",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 101150,
"upload_time": "2025-09-07T20:42:39",
"upload_time_iso_8601": "2025-09-07T20:42:39.358901Z",
"url": "https://files.pythonhosted.org/packages/94/3f/eab1db7ebf87f99b80220157a68cb5db60be1abf72ec59f01b1aeb12ffed/hanzo-0.3.31.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-07 20:42:39",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "hanzoai",
"github_project": "python-sdk",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "hanzo"
}