| Field | Value |
|---|---|
| Name | koder |
| Version | 0.1.0 |
| Summary | An intuitive AI coding assistant and interactive CLI tool that boosts developer productivity with intelligent automation and context-aware support. |
| Upload time | 2025-07-28 14:31:05 |
| Author | Pengfei Ni |
| Maintainer | None |
| Home page | None |
| Docs URL | None |
| License | None |
| Requires Python | >=3.9 |
| Keywords | agent, agentic, ai, copilot, litellm, openai, vibe coding |
| Requirements | No requirements were recorded. |
# Koder
[Python](https://www.python.org/downloads/)
[License](LICENSE)
[Code style: black](https://github.com/psf/black)
[Ruff](https://github.com/astral-sh/ruff)
An intuitive AI coding assistant and interactive CLI tool that boosts developer productivity with intelligent automation and context-aware support.
## 🚀 Why this project
Yet another vibe coding assistant that aims to provide:
- Universal AI provider support - Works with OpenAI, Claude, Gemini, and 100+ providers through intelligent auto-detection.
- Persistent context - Remembers your conversations across sessions with smart token management.
- Rich toolset - File operations, search, shell commands, and web access in one unified interface.
- Zero-config start - Just set your API key and go, with automatic model selection and streaming support.
- Session management - Organize work by project with isolated conversation histories.
## 📋 Requirements
- Python 3.9 or higher.
- An API key (and optional base URL) for OpenAI, Gemini, Anthropic, or another AI provider.
## 🛠️ Installation
### Using uv (Recommended)
```sh
uv tool install koder
```
### Using pip
```bash
pip install koder
```
## 🤖 AI Provider Configuration
Koder automatically detects and uses the LLM provider from environment variables.
**Model Selection**
The `KODER_MODEL` environment variable controls which model to use:
```bash
# OpenAI models
export KODER_MODEL="gpt-4.1"
# Claude models (via LiteLLM)
export KODER_MODEL="claude-opus-4-20250514"
# Google models (via LiteLLM)
export KODER_MODEL="gemini/gemini-2.5-pro"
```
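Since all configuration flows through environment variables, one lightweight convention (plain shell, not a koder feature) is to keep per-project settings in a `.env` file and source it before launching. The file contents below are placeholders:

```shell
# write a per-project env file (values here are placeholders)
cat > .env <<'EOF'
export KODER_MODEL="gpt-4.1"
EOF

# load it into the current shell before launching koder
. ./.env
echo "model: $KODER_MODEL"
```

This keeps model and provider choices isolated per project, which pairs naturally with per-project sessions.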
**AI Providers:**
<details>
<summary>OpenAI</summary>
```bash
# Required
export OPENAI_API_KEY="your-openai-api-key"
# Optional: Custom OpenAI-compatible endpoint
export OPENAI_BASE_URL="https://api.openai.com/v1" # Default
# Optional: Specify model (default: gpt-4.1)
export KODER_MODEL="gpt-4o"
```
</details>
<details>
<summary>Gemini</summary>
```bash
# Required
export GEMINI_API_KEY="your-gemini-api-key"
# Specify model (default: gemini/gemini-2.5-pro)
export KODER_MODEL="gemini/gemini-2.5-pro"
```
</details>
<details>
<summary>Azure OpenAI</summary>
```bash
# Required
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export KODER_MODEL="gpt-4o" # Your Azure deployment model
# Optional
export AZURE_OPENAI_DEPLOYMENT="your-deployment-name" # Defaults to KODER_MODEL
export AZURE_OPENAI_API_VERSION="2025-04-01-preview" # Default version
```
</details>
<details>
<summary>Other AI providers (via LiteLLM)</summary>
[LiteLLM](https://docs.litellm.ai/docs/providers) supports 100+ providers including Anthropic, Google, Cohere, Hugging Face, and more:
```bash
# Anthropic Claude
export ANTHROPIC_API_KEY="your-anthropic-key"
export KODER_MODEL="claude-opus-4-20250514"
# Google Vertex AI
export GOOGLE_APPLICATION_CREDENTIALS="your-sa-path.json"
export VERTEXAI_LOCATION="<your-region>"
export KODER_MODEL="vertex_ai/claude-sonnet-4@20250514"
# Custom OpenAI-compatible endpoints
export OPENAI_API_KEY="your-key"
export OPENAI_BASE_URL="https://your-custom-endpoint.com/v1"
export KODER_MODEL="openai/<your-model-name>"
```
</details>
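Because provider detection is driven entirely by these environment variables, a quick sanity check before launching is to confirm at least one of the documented keys is set. This is a plain-shell sketch, not part of koder itself:

```shell
# sanity check: is at least one documented provider key set?
READY=no
[ -n "${OPENAI_API_KEY:-}" ] && READY=yes
[ -n "${GEMINI_API_KEY:-}" ] && READY=yes
[ -n "${AZURE_OPENAI_API_KEY:-}" ] && READY=yes
[ -n "${ANTHROPIC_API_KEY:-}" ] && READY=yes
echo "provider configured: $READY"
```

If this prints `provider configured: no`, set one of the keys from the sections above before starting koder.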
## ⚡ Quick Start
### Basic Usage
```bash
# Run in interactive mode
koder
# Execute a single prompt
koder "Help me implement a new feature"
# Use a specific session (-s is the short form of --session)
koder --session my-project "Your prompt here"
# Enable streaming mode
koder --stream "Your prompt here"
```
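The same flags work non-interactively, so koder can be called from scripts. A minimal sketch combining the documented `--session` and `--stream` flags, guarded so it degrades gracefully when koder is not on the PATH (the session name and prompt are placeholders):

```shell
# scripted, non-interactive use of the documented flags
if command -v koder >/dev/null 2>&1; then
  MSG=$(koder --session demo --stream "Summarize the repository layout")
else
  MSG="koder not installed"
fi
echo "$MSG"
```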
## 🧪 Development
### Setup Development Environment
```bash
# Clone and setup
git clone https://github.com/feiskyer/koder.git
cd koder
uv sync
uv run koder
```
### Code Quality
```bash
# Format code
black .
# Lint code
ruff check .
# Type checking
mypy .
```
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/your-feature`
3. Make your changes
4. Run formatting and linting: `black . && ruff check .`
5. Commit your changes: `git commit -am 'Add your feature'`
6. Push to the branch: `git push origin feature/your-feature`
7. Submit a pull request
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.