# DurusAI Native CLI
🤖 Native CLI client for DurusAI, an AI-powered development assistant
[Python 3.8+](https://www.python.org/downloads/) · [MIT License](https://opensource.org/licenses/MIT) · [PyPI](https://badge.fury.io/py/durusai)
## Features
- 🔐 **Secure authentication** with JWT tokens and keyring storage
- 💬 **Interactive chat mode** with AI models (Claude, GPT-4, Gemini)
- 🚀 **Single-command queries** for quick AI assistance
- 📊 **Usage statistics** and token tracking
- 🔧 **Multiple AI models** supported
- 🎨 **Rich terminal UI** with markdown rendering
- ⚙️ **Configurable profiles** for different environments
- 🌐 **Cross-platform** (Linux, macOS, Windows)
## Installation
### From PyPI (Recommended)
```bash
pip install durusai-cli
```
### From source
```bash
git clone https://github.com/durusai/cli.git
cd cli/durusai_native_cli
pip install -e .
```
### Development installation
From the cloned repository root:
```bash
pip install -e ".[dev]"
```
## Quick Start
### 1. Login to DurusAI
```bash
durusai login
```
### 2. Ask a question
```bash
durusai query "Explain Python decorators"
```
### 3. Start interactive chat
```bash
durusai chat
```
## Usage
### Authentication
```bash
# Login with username/password
durusai login
# Login with specific profile
durusai login --profile work
# Check current user
durusai whoami
# Logout
durusai logout
```
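For scripted or CI use, the documented `DURUSAI_API_TOKEN` environment variable (see the Environment Variables section below) can stand in for an interactive login. A minimal sketch, assuming the CLI picks the token up automatically:
```bash
# Non-interactive authentication via DURUSAI_API_TOKEN
# (assumption: the CLI reads this without a prior `durusai login`)
export DURUSAI_API_TOKEN="<your-api-token>"
durusai whoami
```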
### Queries
```bash
# Single query
durusai query "How to implement binary search in Python?"
# Query with specific model
durusai query "Explain async/await" --model gpt-4
# Query with custom parameters
durusai query "Write a FastAPI endpoint" --max-tokens 2000 --temperature 0.7
```
### Interactive Mode
```bash
# Start interactive chat
durusai chat
# Chat with specific model
durusai chat --model claude-3-sonnet
```
Interactive commands:
- `/help` - Show help
- `/model <name>` - Switch AI model
- `/clear` - Clear conversation history
- `/history` - Show conversation history
- `/stats` - Show session statistics
- `/quit` - Exit chat mode
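An illustrative session (prompts and output below are approximate, not verbatim CLI output):
```bash
durusai chat --model claude-3-sonnet
# > /model gpt-4-turbo              (switch models mid-conversation)
# > How do I profile a slow Python function?
# ...response rendered as markdown...
# > /stats                          (token usage for this session)
# > /quit
```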
### Models and Statistics
```bash
# List available models
durusai models
# Show usage statistics
durusai stats
# Check API health
durusai health
```
### Configuration
```bash
# Show all settings
durusai config --list
# Set API endpoint
durusai config api_endpoint "https://api.durusai.com"
# Set default model
durusai config settings.default_model "claude-3-sonnet"
# Enable streaming responses
durusai config settings.stream_responses true
```
## Configuration
DurusAI CLI stores configuration in `~/.durusai/`:
```
~/.durusai/
├── config.json      # Main configuration
├── profiles/        # User profiles
│   ├── default.json
│   └── work.json
├── cache/           # Response cache
├── history/         # Command history
└── logs/            # Application logs
```
### Configuration Options
```json
{
  "api_endpoint": "https://api.durusai.com",
  "default_profile": "default",
  "settings": {
    "timeout": 30,
    "retry_count": 3,
    "stream_responses": true,
    "cache_ttl": 3600,
    "auto_update": true,
    "show_token_usage": true,
    "default_model": "claude-3-sonnet-20240229",
    "max_history_size": 1000
  },
  "display": {
    "use_colors": true,
    "show_timestamps": false,
    "markdown_rendering": true,
    "pager_enabled": true
  }
}
```
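The dotted keys accepted by `durusai config` appear to map onto this nested structure (inferred from the examples above), so any of these values can be adjusted from the shell:
```bash
# Adjust nested settings via dotted paths (mapping inferred from the
# configuration examples earlier in this README)
durusai config settings.cache_ttl 7200
durusai config display.show_timestamps true
```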
## API Models
DurusAI supports multiple AI providers:
| Model | Provider | Context Length | Status |
|-------|----------|----------------|---------|
| claude-3-sonnet-20240229 | Anthropic | 200K | ✅ |
| claude-3-haiku-20240307 | Anthropic | 200K | ✅ |
| gpt-4 | OpenAI | 8K | ✅ |
| gpt-4-turbo | OpenAI | 128K | ✅ |
| gemini-pro | Google | 32K | ✅ |
## Examples
### Code Generation
```bash
durusai query "Create a REST API endpoint for user registration with FastAPI"
```
### Code Explanation
```bash
durusai query "Explain this Python code" < my_script.py
```
### Interactive Debugging
```bash
durusai chat
# Then in chat:
# User: I'm getting a TypeError in my Python code
# AI: I'd be happy to help! Please share the error and the relevant code...
```
### Batch Processing
```bash
# Process multiple files
for file in *.py; do
durusai query "Review this code for bugs: $(cat $file)" > "${file%.py}_review.md"
done
```
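A variant that feeds each file on stdin instead of inlining it into the prompt, assuming the CLI reads piped input the same way as in the Code Explanation example above:
```bash
# Same review loop, but piping each file on stdin (assumption: stdin is
# handled as in `durusai query "Explain this Python code" < my_script.py`)
for file in *.py; do
    durusai query "Review this code for bugs" < "$file" > "${file%.py}_review.md"
done
```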
## Environment Variables
- `DURUSAI_API_ENDPOINT` - API endpoint URL
- `DURUSAI_API_TOKEN` - API authentication token
- `DURUSAI_CONFIG_DIR` - Configuration directory (default: ~/.durusai)
- `DURUSAI_PROFILE` - Default profile name
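These are convenient for CI jobs or one-off shells. A minimal sketch, assuming environment variables take precedence over values in `config.json` (the config directory below is a placeholder):
```bash
# Run against an isolated configuration for a CI job (paths are illustrative)
export DURUSAI_API_ENDPOINT="https://api.durusai.com"
export DURUSAI_CONFIG_DIR="$HOME/.durusai-ci"
export DURUSAI_PROFILE="work"
durusai health
```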
## Security
- **Tokens** are stored securely using system keyring (macOS Keychain, Windows Credential Manager, Linux Secret Service)
- **Fallback encryption** using Fernet for systems without keyring
- **No plaintext passwords** stored locally
- **JWT tokens** with automatic refresh
- **TLS/SSL** for all API communications
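To check which credential store the keyring library resolves to on a given machine, a generic (non-DurusAI-specific) check, assuming the `keyring` package is installed alongside the CLI, is:
```bash
# Print the active keyring backend, e.g. macOS Keychain or SecretService
# (plain `keyring` library call, not a DurusAI command)
python -c "import keyring; print(keyring.get_keyring())"
```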
## Development
### Setup
```bash
git clone https://github.com/durusai/cli.git
cd cli/durusai_native_cli
pip install -e ".[dev]"
```
### Testing
```bash
pytest
```
### Code quality
```bash
# Format, lint, and type-check
black durusai/
flake8 durusai/
mypy durusai/
```
### Building
```bash
python -m build
```
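`python -m build` writes an sdist and a wheel to `dist/`. A quick local smoke test of the built wheel (filename pattern taken from the published releases):
```bash
# Install the freshly built wheel into the current environment
pip install dist/durusai_cli-*-py3-none-any.whl
```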
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Run the test suite
6. Create a pull request
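A condensed sketch of that flow (fork URL and branch name are placeholders):
```bash
git clone https://github.com/<your-username>/cli.git
cd cli/durusai_native_cli
git checkout -b feature/my-change
pip install -e ".[dev]"
pytest                               # run the test suite before opening a PR
git push origin feature/my-change
# ...then open a pull request against durusai/cli on GitHub
```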
## License
MIT License - see [LICENSE](LICENSE) file.
## Support
- 📧 Email: support@durusai.com
- 💬 GitHub Issues: [durusai/cli/issues](https://github.com/durusai/cli/issues)
- 📖 Documentation: [docs.durusai.com](https://docs.durusai.com)
## Changelog
See [CHANGELOG.md](CHANGELOG.md) for release history.
---
Made with ❤️ by the DurusAI team