| Field | Value |
| --- | --- |
| Name | cmai |
| Version | 0.1.3 |
| Summary | AI Powered Commit Message Normalization Tool |
| Upload time | 2025-07-16 09:18:39 |
| Requires Python | >=3.10 |
| License | AGPL-3.0 |
| Keywords | ai, commit, git, cli, normalization |
| Requirements | No requirements were recorded. |
# CMAI - AI-Powered Commit Message Normalizer
[License: AGPL-3.0](https://www.gnu.org/licenses/agpl-3.0)
[Python 3.10+](https://www.python.org/downloads/)
CMAI is an intelligent command-line tool that leverages AI to transform informal or colloquial commit messages into standardized, professional Git commit messages. It analyzes your staged changes and uses advanced language models to generate clear, concise, and conventional commit messages.
## 🌟 Features
- **AI-Powered Normalization**: Converts informal commit descriptions into professional, standardized commit messages
- **Git Integration**: Automatically analyzes staged changes to provide context-aware suggestions
- **Multiple AI Provider Support**: Supports various AI providers including OpenAI-compatible APIs, Bailian (Qwen), DeepSeek, SiliconFlow, and local Ollama models
- **Extensible Architecture**: Easy to add new AI providers through the provider factory system
- **Configurable**: Customizable prompt templates and model settings
- **Token Usage Tracking**: Monitor AI token consumption for cost management
- **Comprehensive Logging**: Detailed logging for debugging and monitoring with stream output support
## 🚀 Quick Start
### Installation
Install CMAI using pip:
```bash
pip install cmai
```
Or install from source:
```bash
git clone https://github.com/yumuzhihan/cmai.git
cd cmai
pip install -e .
```
### Quick Setup
1. **Configure your AI provider** (this step is **required**):
Create `~/.config/cmai/settings.env`:
```env
# Example: Using OpenAI
PROVIDER=openai
API_KEY=your_openai_api_key
MODEL=gpt-4o-mini

# Or using Ollama locally
# PROVIDER=ollama
# OLLAMA_HOST=http://localhost:11434
# MODEL=qwen2.5:7b
```
2. **Set your API key** (if using remote providers):
```bash
# For OpenAI
export OPENAI_API_KEY=your_api_key

# For Bailian
export DASHSCOPE_API_KEY=your_api_key

# For DeepSeek
export DEEPSEEK_API_KEY=your_api_key
```
3. **Test the installation**:
```bash
# Stage some changes
git add .

# Generate a commit message
cmai "fixed a bug"
```
### Basic Usage
1. Stage your changes in Git:
```bash
git add .
```
2. Use CMAI to generate a normalized commit message:
```bash
cmai "fix some bugs in user authentication"
```
3. The tool will output a standardized commit message like:
```text
Commit message: Fix authentication bugs in user login module
Tokens used: 45
```
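To feed the result straight into `git commit`, the `Commit message:` line can be stripped of its prefix with standard shell tools. The snippet below simulates cmai's output with `printf` so it is self-contained; in practice, replace the simulated output with a real `cmai "your message"` call (adjust the pattern if your output format differs):

```shell
# cmai prints "Commit message: <text>"; strip the prefix to reuse the message.
# (Simulated with printf here; replace with: output=$(cmai "your message"))
output=$(printf 'Commit message: Fix authentication bugs in user login module\nTokens used: 45\n')
msg=$(printf '%s\n' "$output" | sed -n 's/^Commit message: //p')
echo "$msg"
# Then commit with it:
# git commit -m "$msg"
```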
⚠️ **Important Setup Reminder**: Before using CMAI, you must configure at least two settings:
- `PROVIDER`: The AI provider you want to use (e.g., `openai`, `bailian`, `ollama`)
- `MODEL`: The specific model name (this is **required** for all providers)
Without these configurations, CMAI will fail to run. See the [Configuration](#-configuration) section for details.
## 🔧 Configuration
CMAI uses a configuration file located at `~/.config/cmai/settings.env`. The configuration file will be automatically created on first run.
### Environment Variables
Create or edit `~/.config/cmai/settings.env`:
```env
# AI Provider Configuration
PROVIDER=openai
API_BASE=https://api.openai.com/v1
API_KEY=your_api_key_here
MODEL=gpt-4o-mini

# For Bailian (Qwen) API
# PROVIDER=bailian
# API_BASE=https://dashscope.aliyuncs.com/compatible-mode/v1
# API_KEY=your_dashscope_api_key
# MODEL=qwen-turbo-latest

# For DeepSeek API
# PROVIDER=deepseek
# API_BASE=https://api.deepseek.com/v1
# API_KEY=your_deepseek_api_key
# MODEL=deepseek-chat

# For SiliconFlow API
# PROVIDER=siliconflow
# API_BASE=https://api.siliconflow.cn/v1
# API_KEY=your_siliconflow_api_key
# MODEL=Qwen/Qwen2.5-7B-Instruct

# For Ollama (local)
# PROVIDER=ollama
# OLLAMA_HOST=http://localhost:11434
# MODEL=qwen2.5:7b

# Logging Configuration
LOG_LEVEL=INFO
LOG_FILE_PATH=/path/to/logfile.log

# Prompt Template (optional customization)
PROMPT_TEMPLATE=Please generate a standardized commit message based on the user description: {user_input}. The changes include: {diff_content}. Respond only with the normalized commit message in English.
```
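CMAI itself loads these settings with pydantic-settings; for illustration, the KEY=VALUE format above can be parsed with nothing but the standard library. This is a simplified sketch, not CMAI's actual loader (it ignores quoting and multi-line values):

```python
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    """Parse a simple KEY=VALUE .env file, skipping blank lines and '#' comments."""
    settings = {}
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # skip malformed lines without '='
            settings[key.strip()] = value.strip()
    return settings
```

For example, `load_env(str(Path.home() / ".config/cmai/settings.env"))` would return a dict with keys like `PROVIDER` and `MODEL`.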
### Supported AI Providers
CMAI supports multiple AI providers through its extensible architecture:
#### 1. OpenAI-Compatible APIs
- **OpenAI**: Official OpenAI API
- **Bailian (Qwen)**: Alibaba Cloud's Qwen models
- **DeepSeek**: DeepSeek's AI models
- **SiliconFlow**: SiliconFlow's AI services
- **ChatGPT**: OpenAI ChatGPT models
#### 2. Local Models
- **Ollama**: Run models locally using Ollama
### Important: Model Configuration
⚠️ **Important**: You must specify a `MODEL` in your configuration. CMAI requires an explicit model name to function properly. Examples:
- OpenAI: `gpt-4o-mini`, `gpt-4o`, `gpt-3.5-turbo`
- Bailian: `qwen-turbo-latest`, `qwen-plus-latest`, `qwen-max-latest`
- DeepSeek: `deepseek-chat`, `deepseek-coder`
- SiliconFlow: `Qwen/Qwen2.5-7B-Instruct`, `deepseek-ai/DeepSeek-V2.5`
- Ollama: `qwen2.5:7b`, `llama3.1:8b`, `codellama:7b`
### API Key Setup
Different providers require different API keys:
1. **OpenAI**: Set `OPENAI_API_KEY` or `CMAI_API_KEY` environment variable
```bash
export OPENAI_API_KEY=your_openai_api_key
```
2. **Bailian (Qwen)**: Set `DASHSCOPE_API_KEY` environment variable
```bash
export DASHSCOPE_API_KEY=your_dashscope_api_key
```
3. **DeepSeek**: Set `DEEPSEEK_API_KEY` or `CMAI_API_KEY` environment variable
```bash
export DEEPSEEK_API_KEY=your_deepseek_api_key
```
4. **SiliconFlow**: Set `SILICONFLOW_API_KEY` or `CMAI_API_KEY` environment variable
```bash
export SILICONFLOW_API_KEY=your_siliconflow_api_key
```
5. **Ollama**: No API key required, but ensure Ollama is running locally
6. **Configuration file**: Add `API_KEY=your_api_key_here` to `~/.config/cmai/settings.env`
7. **Custom config file**: Use the `--config` option to specify a different configuration file
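The lookup order described above can be pictured as follows. The variable names come from the list; the exact fallback logic is an illustration, not CMAI's actual code:

```python
import os
from typing import Optional

# Provider-specific environment variables, matching the list above
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "bailian": "DASHSCOPE_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
    "siliconflow": "SILICONFLOW_API_KEY",
}

def resolve_api_key(provider: str, config_key: Optional[str] = None) -> Optional[str]:
    """Illustrative lookup: provider-specific variable first, then CMAI_API_KEY,
    then the API_KEY value from the config file (passed in as config_key)."""
    var = PROVIDER_KEY_VARS.get(provider)
    if var and os.environ.get(var):
        return os.environ[var]
    return os.environ.get("CMAI_API_KEY") or config_key
```

Note that `"ollama"` has no entry, so the sketch returns `None` for it unless a fallback key is set, which matches Ollama needing no API key.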
## 📖 Usage Examples
### Basic Examples
```bash
# Simple commit message normalization (uses configuration file settings)
cmai "updated readme file"

# Using a custom configuration file
cmai "fixed authentication bug" --config /path/to/custom/config.env

# Specifying a different repository
cmai "refactored user service" --repo /path/to/repo
```
### Using Custom Configuration
```bash
# Use a specific configuration file
cmai "refactored auth system" --config /path/to/custom/config.env
```
### Specifying Repository Path
```bash
# Analyze changes in a specific repository
cmai "fixed login bug" --repo /path/to/your/repo
```
### Full Command Options
```bash
cmai [OPTIONS] MESSAGE

Arguments:
  MESSAGE  The informal commit message to be normalized  [required]

Options:
  -c, --config TEXT  Path to configuration file
  -r, --repo TEXT    Git repository path
  --help             Show this message and exit
```
**Note**: Provider and model selection is configured through the configuration file or environment variables, not command-line arguments.
### Configuration Examples for Different Providers
To use different AI providers, configure your `~/.config/cmai/settings.env` file:
#### OpenAI
```env
PROVIDER=openai
API_BASE=https://api.openai.com/v1
API_KEY=your_openai_api_key
MODEL=gpt-4o-mini
```
#### Bailian (Qwen)
```env
PROVIDER=bailian
API_BASE=https://dashscope.aliyuncs.com/compatible-mode/v1
API_KEY=your_dashscope_api_key
MODEL=qwen-turbo-latest
```
#### DeepSeek
```env
PROVIDER=deepseek
API_BASE=https://api.deepseek.com/v1
API_KEY=your_deepseek_api_key
MODEL=deepseek-chat
```
#### SiliconFlow
```env
PROVIDER=siliconflow
API_BASE=https://api.siliconflow.cn/v1
API_KEY=your_siliconflow_api_key
MODEL=Qwen/Qwen2.5-7B-Instruct
```
#### Ollama (Local)
```env
PROVIDER=ollama
OLLAMA_HOST=http://localhost:11434
MODEL=qwen2.5:7b
```
After configuring your preferred provider, simply use:
```bash
cmai "your commit message"
```
## 🏗️ Architecture
CMAI follows a modular architecture with the following components:
### Core Components
- **`cmai.main`**: Entry point and CLI interface using Click
- **`cmai.core.normalizer`**: Core logic for commit message normalization
- **`cmai.core.get_logger`**: Logging factory and configuration with stream support
- **`cmai.config.settings`**: Configuration management using Pydantic
### Provider System
- **`cmai.providers.base`**: Abstract base class for AI providers
- **`cmai.providers.openai_provider`**: OpenAI-compatible API implementation
- **`cmai.providers.ollama_provider`**: Ollama local model implementation
- **`cmai.providers.provider_factory`**: Factory for creating and managing providers
- **`cmai.providers.bailian_provider`**: Legacy Bailian provider (deprecated)
### Utilities
- **`cmai.utils.git_staged_analyzer`**: Git repository analysis and diff extraction
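Conceptually, extracting the staged diff reduces to a single `git` invocation. The helper below is a sketch of that idea, not the module's actual implementation:

```python
import subprocess

def get_staged_diff(repo_path: str = ".") -> str:
    """Return the diff of staged (cached) changes in the given repository."""
    result = subprocess.run(
        ["git", "-C", repo_path, "diff", "--cached"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```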
### Data Models
```python
class AIResponse(BaseModel):
    content: str                # The normalized commit message
    model: str                  # AI model used
    provider: str               # AI provider name
    tokens_used: Optional[int]  # Token consumption
```
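For readers sketching an integration without pydantic installed, the same shape can be expressed as a plain stdlib dataclass (illustrative stand-in only, not CMAI's model):

```python
from dataclasses import dataclass
from typing import Optional

# Plain-dataclass equivalent of the pydantic model above
@dataclass
class AIResponse:
    content: str                # The normalized commit message
    model: str                  # AI model used
    provider: str               # AI provider name
    tokens_used: Optional[int]  # Token consumption

resp = AIResponse(
    content="Fix authentication bugs in user login module",
    model="gpt-4o-mini",
    provider="openai",
    tokens_used=45,
)
```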
### Provider Factory System
CMAI uses a factory pattern for managing AI providers:
```python
from cmai.providers.provider_factory import create_provider
# Create provider with default settings
provider = create_provider()

# Create specific provider with model
provider = create_provider("openai", model="gpt-4o-mini")

# Create Ollama provider
provider = create_provider("ollama", model="qwen2.5:7b")
```
## 🔌 Extending CMAI
### Adding New AI Providers
To add support for a new AI provider, create a class that inherits from `BaseAIClient` and register it with the provider factory:
```python
from cmai.providers.base import BaseAIClient, AIResponse
from cmai.providers.provider_factory import register_custom_provider
class CustomProvider(BaseAIClient):
    async def normalize_commit(self, prompt: str, **kwargs) -> AIResponse:
        # Implement your provider logic here
        # Must return AIResponse with content, model, provider, and tokens_used
        pass

    def validate_config(self) -> bool:
        # Implement configuration validation
        return True

# Register the provider
register_custom_provider("custom", CustomProvider)
```
### Using the Provider Factory
The provider factory automatically manages different AI providers:
```python
from cmai.providers.provider_factory import (
    create_provider,
    list_available_providers,
    register_custom_provider,
)

# List all available providers
providers = list_available_providers()
print(providers)
# Output: {'openai': 'OpenaiProvider', 'ollama': 'OllamaProvider', ...}

# Create provider instances
openai_provider = create_provider("openai", model="gpt-4o-mini")
ollama_provider = create_provider("ollama", model="qwen2.5:7b")
```
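Under the hood, a factory like this is typically a name-to-class registry. The following is a minimal illustration of the pattern with invented names (`register`, `create`, `EchoProvider`), not CMAI's actual source:

```python
# Minimal registry-based factory pattern (illustrative names)
_REGISTRY: dict[str, type] = {}

def register(name: str, cls: type) -> None:
    """Map a provider name to the class that implements it."""
    _REGISTRY[name] = cls

def create(name: str, **kwargs):
    """Instantiate a registered provider, or fail with a clear error."""
    try:
        return _REGISTRY[name](**kwargs)
    except KeyError:
        raise ValueError(f"Unknown provider: {name!r}") from None

class EchoProvider:
    def __init__(self, model: str):
        self.model = model

register("echo", EchoProvider)
provider = create("echo", model="test-model")
```

The registry keeps provider selection data-driven: adding a provider means registering one more class, with no changes to the creation logic.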
### Custom Prompt Templates
You can customize the prompt template by modifying the `PROMPT_TEMPLATE` setting:
```env
PROMPT_TEMPLATE=Generate a standardized commit message based on: {user_input}. Changes: {diff_content}. Use conventional commit format.
```
Available placeholders:
- `{user_input}`: The user's informal commit message
- `{diff_content}`: Git diff information from staged changes
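The placeholder syntax suggests ordinary `str.format`-style substitution; a quick illustration of how a template and its two placeholders combine (the inputs are made up):

```python
template = (
    "Generate a standardized commit message based on: {user_input}. "
    "Changes: {diff_content}. Use conventional commit format."
)
prompt = template.format(
    user_input="fixed login bug",
    diff_content="auth.py: corrected password hash comparison",
)
print(prompt)
```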
## 🛠️ Development
### Setting Up Development Environment
1. Clone the repository:
```bash
git clone https://github.com/yumuzhihan/cmai.git
cd cmai
```
2. Create a virtual environment:
```bash
uv venv

# Activate virtual environment
# On Unix/macOS:
source .venv/bin/activate
# On Windows:
# .venv\Scripts\activate
```
3. Install development dependencies:
```bash
# Install all dependencies including dev dependencies
uv sync --dev

# Or if you prefer to install only production dependencies:
uv sync
```
### Project Structure
```text
cmai/
├── cmai/
│   ├── __init__.py
│   ├── main.py                    # CLI entry point
│   ├── config/
│   │   ├── __init__.py
│   │   └── settings.py            # Configuration management
│   ├── core/
│   │   ├── __init__.py
│   │   ├── get_logger.py          # Logging utilities
│   │   └── normalizer.py          # Core normalization logic
│   ├── providers/
│   │   ├── __init__.py
│   │   ├── base.py                # Abstract provider interface
│   │   ├── openai_provider.py     # OpenAI-compatible implementation
│   │   ├── ollama_provider.py     # Ollama local model implementation
│   │   ├── provider_factory.py    # Provider factory system
│   │   └── bailian_provider.py    # Legacy Bailian provider
│   └── utils/
│       ├── __init__.py
│       └── git_staged_analyzer.py # Git utilities
├── tests/                         # Test suite
├── docs/                          # Documentation
├── scripts/                       # Utility scripts
├── pyproject.toml                 # Project configuration
├── uv.lock                        # UV lock file
├── LICENSE                        # AGPL-3.0 License
└── README.md                      # This file
```
### Running Tests
```bash
python -m pytest tests/
```
## 📋 Requirements
- Python 3.10 or higher
- Git (for repository analysis)
- Internet connection (for remote AI provider APIs)
- Ollama installation (for local AI models)
### Dependencies
- `click>=8.2.1` - Command-line interface
- `openai>=1.91.0` - OpenAI-compatible API client
- `ollama>=0.5.1` - Ollama Python client for local models
- `pydantic>=2.11.7` - Data validation and settings
- `pydantic-settings>=2.10.0` - Settings management
## 🤝 Contributing
We welcome contributions! Please follow these steps:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Add tests for new functionality
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request
### Contribution Guidelines
- Follow PEP 8 coding standards
- Add type hints to all functions
- Write comprehensive tests
- Update documentation for new features
- Ensure backward compatibility
- When adding new providers, register them in the provider factory
### Adding New Providers
When contributing new AI providers:
1. Create a new file in `cmai/providers/` (e.g., `custom_provider.py`)
2. Implement the `BaseAIClient` interface
3. Register the provider in `provider_factory.py`
4. Add configuration examples to the README
5. Include tests for the new provider
## 📄 License
This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0). See the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- Built with [Click](https://click.palletsprojects.com/) for the CLI interface
- Uses [Pydantic](https://pydantic.dev/) for configuration and data validation
- Supports multiple AI providers including [OpenAI](https://openai.com/), [Bailian](https://bailian.console.aliyun.com/), [DeepSeek](https://www.deepseek.com/), [SiliconFlow](https://siliconflow.cn/), and [Ollama](https://ollama.com/)
- Inspired by conventional commit standards
- Uses [UV](https://docs.astral.sh/uv/) for fast Python package management
## 📞 Support
If you encounter any issues or have questions:
1. Check the [Issues](https://github.com/yumuzhihan/cmai/issues) page
2. Create a new issue with detailed information about your setup (provider, model, configuration)
3. Review the documentation and configuration guide
4. For provider-specific issues, include your provider and model information
### Common Issues
- **"Model must be specified"**: Ensure you have set the `MODEL` in your configuration file or passed it as a command-line argument
- **API key errors**: Verify that your API key is correctly set for your chosen provider
- **Connection errors**: Check your internet connection and API endpoint URLs
- **Ollama connection issues**: Ensure Ollama is running locally and accessible at the configured host
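For the Ollama case, reachability can be checked directly against Ollama's HTTP API (`/api/tags` is its model-listing endpoint). The helper below is a convenience sketch, not part of CMAI:

```python
import urllib.error
import urllib.request

def ollama_reachable(host: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at `host`."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False
```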
---
**Note**: This tool requires access to AI language models. Please ensure you have appropriate API keys and understand the associated costs before using CMAI in production environments. For local usage, consider using Ollama with open-source models.
## Raw data
{
"_id": null,
"home_page": null,
"name": "cmai",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": "yumuzhihan <1573252900@qq.com>",
"keywords": "ai, commit, git, cli, normalization",
"author": null,
"author_email": "yumuzhihan <1573252900@qq.com>",
"download_url": "https://files.pythonhosted.org/packages/3a/de/351a81c35213bcd8a7b6905c002be057b23f8019c4bff8fb704d8761380c/cmai-0.1.3.tar.gz",
"platform": null,
"description": "# CMAI - AI-Powered Commit Message Normalizer\n\n[](https://www.gnu.org/licenses/agpl-3.0)\n[](https://www.python.org/downloads/)\n\nCMAI is an intelligent command-line tool that leverages AI to transform informal or colloquial commit messages into standardized, professional Git commit messages. It analyzes your staged changes and uses advanced language models to generate clear, concise, and conventional commit messages.\n\n## \ud83c\udf1f Features\n\n- **AI-Powered Normalization**: Converts informal commit descriptions into professional, standardized commit messages\n- **Git Integration**: Automatically analyzes staged changes to provide context-aware suggestions\n- **Multiple AI Provider Support**: Supports various AI providers including OpenAI-compatible APIs, Bailian (Qwen), DeepSeek, SiliconFlow, and local Ollama models\n- **Extensible Architecture**: Easy to add new AI providers through the provider factory system\n- **Configurable**: Customizable prompt templates and model settings\n- **Token Usage Tracking**: Monitor AI token consumption for cost management\n- **Comprehensive Logging**: Detailed logging for debugging and monitoring with stream output support\n\n## \ud83d\ude80 Quick Start\n\n### Installation\n\nInstall CMAI using pip:\n\n```bash\npip install cmai\n```\n\nOr install from source:\n\n```bash\ngit clone https://github.com/yumuzhihan/cmai.git\ncd cmai\npip install -e .\n```\n\n### Quick Setup\n\n1. **Configure your AI provider** (this step is **required**):\n\nCreate `~/.config/cmai/settings.env`:\n\n```env\n# Example: Using OpenAI\nPROVIDER=openai\nAPI_KEY=your_openai_api_key\nMODEL=gpt-4o-mini\n\n# Or using Ollama locally\n# PROVIDER=ollama\n# OLLAMA_HOST=http://localhost:11434\n# MODEL=qwen2.5:7b\n```\n\n2. 
**Set your API key** (if using remote providers):\n\n```bash\n# For OpenAI\nexport OPENAI_API_KEY=your_api_key\n\n# For Bailian\nexport DASHSCOPE_API_KEY=your_api_key\n\n# For DeepSeek\nexport DEEPSEEK_API_KEY=your_api_key\n```\n\n3. **Test the installation**:\n\n```bash\n# Stage some changes\ngit add .\n\n# Generate a commit message\ncmai \"fixed a bug\"\n```\n\n### Basic Usage\n\n1. Stage your changes in Git:\n\n```bash\ngit add .\n```\n\n2. Use CMAI to generate a normalized commit message:\n\n```bash\ncmai \"fix some bugs in user authentication\"\n```\n\n3. The tool will output a standardized commit message like:\n\n```text\nCommit message: Fix authentication bugs in user login module\nTokens used: 45\n```\n\n\u26a0\ufe0f **Important Setup Reminder**: Before using CMAI, you must configure at least two settings:\n\n- `PROVIDER`: The AI provider you want to use (e.g., `openai`, `bailian`, `ollama`)\n- `MODEL`: The specific model name (this is **required** for all providers)\n\nWithout these configurations, CMAI will fail to run. See the [Configuration](#-configuration) section for details.\n\n## \ud83d\udd27 Configuration\n\nCMAI uses a configuration file located at `~/.config/cmai/settings.env`. 
The configuration file will be automatically created on first run.\n\n### Environment Variables\n\nCreate or edit `~/.config/cmai/settings.env`:\n\n```env\n# AI Provider Configuration\nPROVIDER=openai\nAPI_BASE=https://api.openai.com/v1\nAPI_KEY=your_api_key_here\nMODEL=gpt-4o-mini\n\n# For Bailian (Qwen) API\n# PROVIDER=bailian\n# API_BASE=https://dashscope.aliyuncs.com/compatible-mode/v1\n# API_KEY=your_dashscope_api_key\n# MODEL=qwen-turbo-latest\n\n# For DeepSeek API\n# PROVIDER=deepseek\n# API_BASE=https://api.deepseek.com/v1\n# API_KEY=your_deepseek_api_key\n# MODEL=deepseek-chat\n\n# For SiliconFlow API\n# PROVIDER=siliconflow\n# API_BASE=https://api.siliconflow.cn/v1\n# API_KEY=your_siliconflow_api_key\n# MODEL=Qwen/Qwen2.5-7B-Instruct\n\n# For Ollama (local)\n# PROVIDER=ollama\n# OLLAMA_HOST=http://localhost:11434\n# MODEL=qwen2.5:7b\n\n# Logging Configuration\nLOG_LEVEL=INFO\nLOG_FILE_PATH=/path/to/logfile.log\n\n# Prompt Template (optional customization)\nPROMPT_TEMPLATE=Please generate a standardized commit message based on the user description: {user_input}. The changes include: {diff_content}. Respond only with the normalized commit message in English.\n```\n\n### Supported AI Providers\n\nCMAI supports multiple AI providers through its extensible architecture:\n\n#### 1. OpenAI-Compatible APIs\n\n- **OpenAI**: Official OpenAI API\n- **Bailian (Qwen)**: Alibaba Cloud's Qwen models\n- **DeepSeek**: DeepSeek's AI models\n- **SiliconFlow**: SiliconFlow's AI services\n- **ChatGPT**: OpenAI ChatGPT models\n\n#### 2. Local Models\n\n- **Ollama**: Run models locally using Ollama\n\n### Important: Model Configuration\n\n\u26a0\ufe0f **Important**: You must specify a `MODEL` in your configuration. CMAI requires an explicit model name to function properly. 
Examples:\n\n- OpenAI: `gpt-4o-mini`, `gpt-4o`, `gpt-3.5-turbo`\n- Bailian: `qwen-turbo-latest`, `qwen-plus-latest`, `qwen-max-latest`\n- DeepSeek: `deepseek-chat`, `deepseek-coder`\n- SiliconFlow: `Qwen/Qwen2.5-7B-Instruct`, `deepseek-ai/DeepSeek-V2.5`\n- Ollama: `qwen2.5:7b`, `llama3.1:8b`, `codellama:7b`\n\n### API Key Setup\n\nDifferent providers require different API keys:\n\n1. **OpenAI**: Set `OPENAI_API_KEY` or `CMAI_API_KEY` environment variable\n\n```bash\nexport OPENAI_API_KEY=your_openai_api_key\n```\n\n2. **Bailian (Qwen)**: Set `DASHSCOPE_API_KEY` environment variable\n\n```bash\nexport DASHSCOPE_API_KEY=your_dashscope_api_key\n```\n\n3. **DeepSeek**: Set `DEEPSEEK_API_KEY` or `CMAI_API_KEY` environment variable\n\n```bash\nexport DEEPSEEK_API_KEY=your_deepseek_api_key\n```\n\n4. **SiliconFlow**: Set `SILICONFLOW_API_KEY` or `CMAI_API_KEY` environment variable\n\n```bash\nexport SILICONFLOW_API_KEY=your_siliconflow_api_key\n```\n\n5. **Ollama**: No API key required, but ensure Ollama is running locally\n\n6. **Configuration file**: Add `API_KEY=your_api_key_here` to `~/.config/cmai/settings.env`\n\n7. 
**Custom config file**: Use the `--config` option to specify a different configuration file\n\n## \ud83d\udcd6 Usage Examples\n\n### Usage Examples\n\n```bash\n# Simple commit message normalization (uses configuration file settings)\ncmai \"updated readme file\"\n\n# Using a custom configuration file\ncmai \"fixed authentication bug\" --config /path/to/custom/config.env\n\n# Specifying a different repository\ncmai \"refactored user service\" --repo /path/to/repo\n```\n\n### Using Custom Configuration\n\n```bash\n# Use a specific configuration file\ncmai \"refactored auth system\" --config /path/to/custom/config.env\n```\n\n### Specifying Repository Path\n\n```bash\n# Analyze changes in a specific repository\ncmai \"fixed login bug\" --repo /path/to/your/repo\n```\n\n### Full Command Options\n\n```bash\ncmai [OPTIONS] MESSAGE\n\nArguments:\n MESSAGE The informal commit message to be normalized [required]\n\nOptions:\n -c, --config TEXT Path to configuration file\n -r, --repo TEXT Git repository path\n --help Show this message and exit\n```\n\n**Note**: Provider and model selection is configured through the configuration file or environment variables, not command-line arguments.\n\n### Configuration Examples for Different Providers\n\nTo use different AI providers, configure your `~/.config/cmai/settings.env` file:\n\n#### OpenAI\n\n```env\nPROVIDER=openai\nAPI_BASE=https://api.openai.com/v1\nAPI_KEY=your_openai_api_key\nMODEL=gpt-4o-mini\n```\n\n#### Bailian (Qwen)\n\n```env\nPROVIDER=bailian\nAPI_BASE=https://dashscope.aliyuncs.com/compatible-mode/v1\nAPI_KEY=your_dashscope_api_key\nMODEL=qwen-turbo-latest\n```\n\n#### DeepSeek\n\n```env\nPROVIDER=deepseek\nAPI_BASE=https://api.deepseek.com/v1\nAPI_KEY=your_deepseek_api_key\nMODEL=deepseek-chat\n```\n\n#### SiliconFlow\n\n```env\nPROVIDER=siliconflow\nAPI_BASE=https://api.siliconflow.cn/v1\nAPI_KEY=your_siliconflow_api_key\nMODEL=Qwen/Qwen2.5-7B-Instruct\n```\n\n#### Ollama 
(Local)\n\n```env\nPROVIDER=ollama\nOLLAMA_HOST=http://localhost:11434\nMODEL=qwen2.5:7b\n```\n\nAfter configuring your preferred provider, simply use:\n\n```bash\ncmai \"your commit message\"\n```\n\n## \ud83c\udfd7\ufe0f Architecture\n\nCMAI follows a modular architecture with the following components:\n\n### Core Components\n\n- **`cmai.main`**: Entry point and CLI interface using Click\n- **`cmai.core.normalizer`**: Core logic for commit message normalization\n- **`cmai.core.get_logger`**: Logging factory and configuration with stream support\n- **`cmai.config.settings`**: Configuration management using Pydantic\n\n### Provider System\n\n- **`cmai.providers.base`**: Abstract base class for AI providers\n- **`cmai.providers.openai_provider`**: OpenAI-compatible API implementation\n- **`cmai.providers.ollama_provider`**: Ollama local model implementation\n- **`cmai.providers.provider_factory`**: Factory for creating and managing providers\n- **`cmai.providers.bailian_provider`**: Legacy Bailian provider (deprecated)\n\n### Utilities\n\n- **`cmai.utils.git_staged_analyzer`**: Git repository analysis and diff extraction\n\n### Data Models\n\n```python\nclass AIResponse(BaseModel):\n content: str # The normalized commit message\n model: str # AI model used\n provider: str # AI provider name\n tokens_used: Optional[int] # Token consumption\n```\n\n### Provider Factory System\n\nCMAI uses a factory pattern for managing AI providers:\n\n```python\nfrom cmai.providers.provider_factory import create_provider\n\n# Create provider with default settings\nprovider = create_provider()\n\n# Create specific provider with model\nprovider = create_provider(\"openai\", model=\"gpt-4o-mini\")\n\n# Create Ollama provider\nprovider = create_provider(\"ollama\", model=\"qwen2.5:7b\")\n```\n\n## \ud83d\udd0c Extending CMAI\n\n### Adding New AI Providers\n\nTo add support for a new AI provider, create a class that inherits from `BaseAIClient` and register it with the provider 
factory:\n\n```python\nfrom cmai.providers.base import BaseAIClient, AIResponse\nfrom cmai.providers.provider_factory import register_custom_provider\n\nclass CustomProvider(BaseAIClient):\n async def normalize_commit(self, prompt: str, **kwargs) -> AIResponse:\n # Implement your provider logic here\n # Must return AIResponse with content, model, provider, and tokens_used\n pass\n \n def validate_config(self) -> bool:\n # Implement configuration validation\n return True\n\n# Register the provider\nregister_custom_provider(\"custom\", CustomProvider)\n```\n\n### Using the Provider Factory\n\nThe provider factory automatically manages different AI providers:\n\n```python\nfrom cmai.providers.provider_factory import (\n create_provider,\n list_available_providers,\n register_custom_provider\n)\n\n# List all available providers\nproviders = list_available_providers()\nprint(providers)\n# Output: {'openai': 'OpenaiProvider', 'ollama': 'OllamaProvider', ...}\n\n# Create provider instances\nopenai_provider = create_provider(\"openai\", model=\"gpt-4o-mini\")\nollama_provider = create_provider(\"ollama\", model=\"qwen2.5:7b\")\n```\n\n### Custom Prompt Templates\n\nYou can customize the prompt template by modifying the `PROMPT_TEMPLATE` setting:\n\n```env\nPROMPT_TEMPLATE=Generate a standardized commit message based on: {user_input}. Changes: {diff_content}. Use conventional commit format.\n```\n\nAvailable placeholders:\n\n- `{user_input}`: The user's informal commit message\n- `{diff_content}`: Git diff information from staged changes\n\n## \ud83d\udee0\ufe0f Development\n\n### Setting Up Development Environment\n\n1. Clone the repository:\n\n```bash\ngit clone https://github.com/yumuzhihan/cmai.git\ncd cmai\n```\n\n2. Create a virtual environment:\n\n```bash\nuv venv\n\n# Activate virtual environment\n# On Unix/macOS:\nsource .venv/bin/activate\n# On Windows:\n# .venv\\Scripts\\activate\n```\n\n3. 
Install development dependencies:\n\n```bash\n# Install all dependencies including dev dependencies\nuv sync --dev\n\n# Or if you prefer to install only production dependencies:\nuv sync\n```\n\n### Project Structure\n\n```text\ncmai/\n\u251c\u2500\u2500 cmai/\n\u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u251c\u2500\u2500 main.py # CLI entry point\n\u2502 \u251c\u2500\u2500 config/\n\u2502 \u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2502 \u2514\u2500\u2500 settings.py # Configuration management\n\u2502 \u251c\u2500\u2500 core/\n\u2502 \u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2502 \u251c\u2500\u2500 get_logger.py # Logging utilities\n\u2502 \u2502 \u2514\u2500\u2500 normalizer.py # Core normalization logic\n\u2502 \u251c\u2500\u2500 providers/\n\u2502 \u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2502 \u251c\u2500\u2500 base.py # Abstract provider interface\n\u2502 \u2502 \u251c\u2500\u2500 openai_provider.py # OpenAI-compatible implementation\n\u2502 \u2502 \u251c\u2500\u2500 ollama_provider.py # Ollama local model implementation\n\u2502 \u2502 \u251c\u2500\u2500 provider_factory.py # Provider factory system\n\u2502 \u2502 \u2514\u2500\u2500 bailian_provider.py # Legacy Bailian provider\n\u2502 \u2514\u2500\u2500 utils/\n\u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2514\u2500\u2500 git_staged_analyzer.py # Git utilities\n\u251c\u2500\u2500 tests/ # Test suite\n\u251c\u2500\u2500 docs/ # Documentation\n\u251c\u2500\u2500 scripts/ # Utility scripts\n\u251c\u2500\u2500 pyproject.toml # Project configuration\n\u251c\u2500\u2500 uv.lock # UV lock file\n\u251c\u2500\u2500 LICENSE # AGPL-3.0 License\n\u2514\u2500\u2500 README.md # This file\n```\n\n### Running Tests\n\n```bash\npython -m pytest tests/\n```\n\n## \ud83d\udccb Requirements\n\n- Python 3.10 or higher\n- Git (for repository analysis)\n- Internet connection (for remote AI provider APIs)\n- Ollama installation (for local AI models)\n\n### Dependencies\n\n- `click>=8.2.1` - Command-line 
interface\n- `openai>=1.91.0` - OpenAI-compatible API client\n- `ollama>=0.5.1` - Ollama Python client for local models\n- `pydantic>=2.11.7` - Data validation and settings\n- `pydantic-settings>=2.10.0` - Settings management\n\n## \ud83e\udd1d Contributing\n\nWe welcome contributions! Please follow these steps:\n\n1. Fork the repository\n2. Create a feature branch (`git checkout -b feature/amazing-feature`)\n3. Make your changes\n4. Add tests for new functionality\n5. Commit your changes (`git commit -m 'Add amazing feature'`)\n6. Push to the branch (`git push origin feature/amazing-feature`)\n7. Open a Pull Request\n\n### Contribution Guidelines\n\n- Follow PEP 8 coding standards\n- Add type hints to all functions\n- Write comprehensive tests\n- Update documentation for new features\n- Ensure backward compatibility\n- When adding new providers, register them in the provider factory\n\n### Adding New Providers\n\nWhen contributing new AI providers:\n\n1. Create a new file in `cmai/providers/` (e.g., `custom_provider.py`)\n2. Implement the `BaseAIClient` interface\n3. Register the provider in `provider_factory.py`\n4. Add configuration examples to the README\n5. Include tests for the new provider\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0). 
See the [LICENSE](LICENSE) file for details.\n\n## \ud83d\ude4f Acknowledgments\n\n- Built with [Click](https://click.palletsprojects.com/) for the CLI interface\n- Uses [Pydantic](https://pydantic.dev/) for configuration and data validation\n- Supports multiple AI providers including [OpenAI](https://openai.com/), [Bailian](https://bailian.console.aliyun.com/), [DeepSeek](https://www.deepseek.com/), [SiliconFlow](https://siliconflow.cn/), and [Ollama](https://ollama.com/)\n- Inspired by conventional commit standards\n- Uses [UV](https://docs.astral.sh/uv/) for fast Python package management\n\n## \ud83d\udcde Support\n\nIf you encounter any issues or have questions:\n\n1. Check the [Issues](https://github.com/yumuzhihan/cmai/issues) page\n2. Create a new issue with detailed information about your setup (provider, model, configuration)\n3. Review the documentation and configuration guide\n4. For provider-specific issues, include your provider and model information\n\n### Common Issues\n\n- **\"Model must be specified\"**: Ensure you have set the `MODEL` in your configuration file or passed it as a command-line argument\n- **API key errors**: Verify that your API key is correctly set for your chosen provider\n- **Connection errors**: Check your internet connection and API endpoint URLs\n- **Ollama connection issues**: Ensure Ollama is running locally and accessible at the configured host\n\n---\n\n**Note**: This tool requires access to AI language models. Please ensure you have appropriate API keys and understand the associated costs before using CMAI in production environments. For local usage, consider using Ollama with open-source models.\n",
"bugtrack_url": null,
"license": "AGPL-3.0",
"summary": "AI Powered Commit Message Normalization Tool",
"version": "0.1.3",
"project_urls": {
"Bug Tracker": "https://github.com/yumuzhihan/cmai/issues",
"Documentation": "https://github.com/yumuzhihan/cmai#readme",
"Homepage": "https://github.com/yumuzhihan/cmai",
"Repository": "https://github.com/yumuzhihan/cmai"
},
"split_keywords": [
"ai",
" commit",
" git",
" cli",
" normalization"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "9a97f89721c6422db35493e72819d178be636e67487de6c98732e888d1e4735d",
"md5": "deb8899fae059ade31bcada1b86cb2fe",
"sha256": "c2fcf46ca07e78ba67a1152ed4400b2ddc3ed038da7ac44124c1fee961a23f49"
},
"downloads": -1,
"filename": "cmai-0.1.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "deb8899fae059ade31bcada1b86cb2fe",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 32546,
"upload_time": "2025-07-16T09:18:39",
"upload_time_iso_8601": "2025-07-16T09:18:39.081437Z",
"url": "https://files.pythonhosted.org/packages/9a/97/f89721c6422db35493e72819d178be636e67487de6c98732e888d1e4735d/cmai-0.1.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "3ade351a81c35213bcd8a7b6905c002be057b23f8019c4bff8fb704d8761380c",
"md5": "5f80eeaaadffe0424002d3f1dcc146d4",
"sha256": "11aec3b9775f49a943b64549dce33cf226d29377a09756cfa47ed6304c322ebb"
},
"downloads": -1,
"filename": "cmai-0.1.3.tar.gz",
"has_sig": false,
"md5_digest": "5f80eeaaadffe0424002d3f1dcc146d4",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 34277,
"upload_time": "2025-07-16T09:18:39",
"upload_time_iso_8601": "2025-07-16T09:18:39.954282Z",
"url": "https://files.pythonhosted.org/packages/3a/de/351a81c35213bcd8a7b6905c002be057b23f8019c4bff8fb704d8761380c/cmai-0.1.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-16 09:18:39",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "yumuzhihan",
"github_project": "cmai",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "cmai"
}