# 🤖 commity
[PyPI](https://pypi.org/project/commity) | [License](https://github.com/freboe/commity/blob/main/LICENSE)

[English](https://github.com/freboe/commity/blob/main/README.md) | [中文](https://github.com/freboe/commity/blob/main/README.zh.md)
Generate intelligent Git commit messages with AI. Supports Conventional Commits, emoji, and multiple LLM providers like OpenAI, Ollama, and Gemini.
## 🤔 What is Commity?
**Commity** is an open-source, AI-powered Git commit message generation tool. It analyzes your staged code changes and automatically generates commit messages that follow the [**Conventional Commits**](https://www.conventionalcommits.org/) specification, and can even add emojis for you!
With a simple `commity --emoji` command, you can get a professional and clear commit message like this:
```
feat(api): ✨ add user authentication endpoint
```
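In practice the flow is: stage your changes, then run `commity`. A minimal sketch follows (it assumes `commity` only suggests a message for whatever is currently staged, so the final `git commit` is still up to you; the file path is just an example):

```bash
# Stage the changes you want described (example path only)
git add src/api/auth.py

# Ask commity for a Conventional Commit message, with an emoji
commity --emoji

# Commit as usual, using the suggested message
git commit
```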
## 🔧 Installation
Install with `pip`:
```bash
pip install commity
```
Or install with `uv`:
```bash
uv tool install commity
```
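Either way, you can check that the CLI is on your `PATH` by printing its help text (see the Usage section below for more flags):

```bash
commity --help
```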
## ⚙️ Configuration
`commity` supports three configuration methods, with the following priority: **Command-line Arguments > Environment Variables > Configuration File**.
Supported model providers are: `Gemini` (default), `Ollama`, `OpenAI`, `OpenRouter`.
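Because command-line arguments take precedence, a flag passed at invocation time overrides an exported default. For example (the model names here are just placeholders):

```bash
# Environment variables set the defaults...
export COMMITY_PROVIDER=ollama
export COMMITY_MODEL=llama2

# ...but the --model flag wins for this invocation
commity --model llama3
```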
### ⨠Method 1: Specify Model Parameters via Command-line
#### OpenAI
```Bash
commity --provider openai --model gpt-3.5-turbo --api_key <your-api-key>
```
#### Ollama
```Bash
commity --provider ollama --model llama2 --base_url http://localhost:11434
```
#### Gemini
```Bash
commity --provider gemini --model gemini-2.5-flash --base_url https://generativelanguage.googleapis.com --api_key <your-api-key> --timeout 30
```
or
```Bash
commity \
--provider gemini \
--model gemini-2.5-flash \
--base_url https://generativelanguage.googleapis.com \
--api_key <your-api-key> \
--timeout 30
```
#### OpenRouter
```Bash
commity --provider openrouter --model openai/gpt-3.5-turbo --api_key <your-openrouter-api-key>
```
or
```Bash
commity \
--provider openrouter \
--model anthropic/claude-3.5-sonnet \
--api_key <your-openrouter-api-key>
```
### 🌱 Method 2: Set Environment Variables as Defaults
You can add the following to your `.bashrc`, `.zshrc`, or `.env` file:
#### OpenAI
```Bash
export COMMITY_PROVIDER=openai
export COMMITY_MODEL=gpt-3.5-turbo
export COMMITY_API_KEY=your-api-key
```
#### Ollama
```Bash
export COMMITY_PROVIDER=ollama
export COMMITY_MODEL=llama2
export COMMITY_BASE_URL=http://localhost:11434
```
#### Gemini
```Bash
export COMMITY_PROVIDER=gemini
export COMMITY_MODEL=gemini-2.5-flash
export COMMITY_BASE_URL=https://generativelanguage.googleapis.com
export COMMITY_API_KEY=your-api-key
export COMMITY_TEMPERATURE=0.5
```
#### OpenRouter
```Bash
export COMMITY_PROVIDER=openrouter
export COMMITY_MODEL=openai/gpt-3.5-turbo
export COMMITY_API_KEY=your-openrouter-api-key
export COMMITY_TEMPERATURE=0.5
```
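If you keep these variables in a project-local `.env` file instead of your shell profile, one generic way to make them visible to `commity` is to export them into the current shell first (a plain-shell sketch, not a commity-specific feature):

```bash
# Export every variable defined in .env, then run commity with those defaults
set -a
source .env
set +a
commity
```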
### 📝 Method 3: Use a Configuration File (Recommended)
For easier configuration management, you can create a `config.json` file at `~/.commity/config.json` in your home directory.
1. Create the directory:

   ```bash
   mkdir -p ~/.commity
   ```

2. Create and edit the `config.json` file:

   ```bash
   touch ~/.commity/config.json
   ```

3. Add your configuration to `config.json`, for example:

   ```json
   {
     "PROVIDER": "ollama",
     "MODEL": "llama3",
     "BASE_URL": "http://localhost:11434"
   }
   ```

   Or using Gemini:

   ```json
   {
     "PROVIDER": "gemini",
     "MODEL": "gemini-1.5-flash",
     "BASE_URL": "https://generativelanguage.googleapis.com",
     "API_KEY": "your-gemini-api-key"
   }
   ```

   Or using OpenAI:

   ```json
   {
     "PROVIDER": "openai",
     "MODEL": "gpt-3.5-turbo",
     "API_KEY": "your-openai-api-key"
   }
   ```

   Or using OpenRouter:

   ```json
   {
     "PROVIDER": "openrouter",
     "MODEL": "openai/gpt-3.5-turbo",
     "API_KEY": "your-openrouter-api-key"
   }
   ```
## 🚀 Usage
```Bash
# Run
commity

# View help
commity --help

# Use Chinese
commity --lang zh

# Include emojis
commity --emoji

# Use OpenRouter with a specific model
commity --provider openrouter --model anthropic/claude-3.5-sonnet --api_key <your-openrouter-api-key>

# Use OpenRouter with emoji support
commity --provider openrouter --model openai/gpt-4o --api_key <your-openrouter-api-key> --emoji
```
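If the generated message is printed to standard output (an assumption, not something this README states), you can pass it straight to `git commit`:

```bash
# Assumes commity writes only the commit message to stdout
git commit -m "$(commity --emoji)"
```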