# Commify
Commify is a command-line interface (CLI) tool that generates meaningful, structured commit messages for Git repositories using AI. By analyzing the staged changes (diff) in your repository, it creates commit messages that follow conventional commit guidelines, optionally including emojis for better context and readability. See the [Commify](https://matuco19.com/Commify) website to learn more. Don't forget to ⭐ the project!
> **Note:** The Ollama provider can be slow without a good GPU or when running a very large AI model; this is not a Commify optimization issue.
<!-- space -->
>
> <sup><strong>Latest version:</strong></sup> [![PyPI version](https://img.shields.io/pypi/v/Commify?color=blue)](https://pypi.org/project/Commify)
> <sup><strong>Stats:</strong></sup> [![Downloads](https://static.pepy.tech/badge/Commify)](https://pepy.tech/project/Commify) [![Downloads](https://static.pepy.tech/badge/Commify/month)](https://pepy.tech/project/Commify)
---
## ✨ Features
- **AI-Powered Commit Messages:** Generate concise and structured commit messages using the local `ollama` provider or the `G4F` provider (the rough flow is sketched after this list).
- **Emoji Support:** Optionally include relevant emojis in commit messages.
- **Language Support:** Generate commit messages in the language of your choice.
- **Customizable Providers:** Specify the AI provider to use (g4f or ollama).
- **Interactive Review System:** Review and approve generated messages or request new ones.
- **Customizable Models:** Specify the AI model to use.
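
Under the hood the idea is straightforward: read the staged diff and ask an AI model for a message that follows the Conventional Commits style. The sketch below only illustrates that flow, assuming the `gitpython` and `ollama` Python packages; the function name and prompt wording are hypothetical, not Commify's actual internals.

```python
# Illustrative sketch only -- not Commify's actual implementation.
# Assumes `pip install gitpython ollama` and a local Ollama server.
import git
import ollama


def generate_commit_message(repo_path: str, lang: str = "english",
                            emoji: bool = True, model: str = "llama3.1") -> str:
    # Collect the staged changes (what `git diff --cached` prints).
    repo = git.Repo(repo_path)
    diff = repo.git.diff("--cached")

    # Ask for a Conventional Commits style message in the requested language.
    prompt = (
        f"Write a commit message in {lang} that follows the Conventional Commits "
        f"specification for the diff below. "
        f"{'Include a fitting emoji.' if emoji else 'Do not use emojis.'}\n\n{diff}"
    )

    response = ollama.chat(model=model, messages=[{"role": "user", "content": prompt}])
    return response["message"]["content"]
```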
---
## 🛠️ Installation
### Windows
Make sure you have `Git`, Python 3.10+, and `ollama` installed (`ollama` is optional).
Then run:
```bash
pip install Commify
```
### Linux
Make sure you have `Git`, Python 3.10+, `pipx`, and `ollama` installed (`ollama` is optional).
If you don't, install them with:
```bash
sudo apt install git
sudo apt install pipx
```
And install Commify:
```bash
pipx install Commify
pipx ensurepath
```
After that, restart your terminal and Commify will be available on your `PATH`.
---
## 🏗️ Usage
Run the `commify` CLI with the desired options:
```bash
commify <path_to_repo> [--lang <language>] [--emoji <True/False>] [--model <AI_model>] [--provider <AI_PROVIDER>]
```
### Examples
Using Ollama Provider:
```bash
commify /path/to/repo --lang english --emoji True --model llama3.1 --provider ollama
```
Using G4F Provider:
```bash
commify /path/to/repo --lang english --emoji True --model gpt-4o --provider g4f
```
Without Specifying The Repository Path:
```bash
cd /path/to/repo
commify --lang english --emoji True --model llama3.1 --provider ollama
```
### Arguments
- **`path`:** Path to the Git repository (if not specified, the directory Commify is run from is used).
- **`--lang`:** Language for the commit message (default: `english`).
- **`--provider`:** AI provider to use for generating messages (default: `ollama`).
- **`--emoji`:** Include emojis in the commit message (`True` or `False`, default: `True`).
- **`--model`:** AI model to use for generating messages (default: `llama3.1`).
- **`--help`:** Display all available parameters and their descriptions.
- **`--version`:** Display the installed Commify version.
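
For illustration only, the options above could be declared with Python's `argparse` roughly as follows; this is a hypothetical sketch of the interface, not Commify's actual argument parser.

```python
# Hypothetical sketch of how Commify's CLI flags could be declared with argparse.
import argparse

parser = argparse.ArgumentParser(prog="commify",
                                 description="AI-generated commit messages for Git")
parser.add_argument("path", nargs="?", default=".",
                    help="Git repository (defaults to the current directory)")
parser.add_argument("--lang", default="english", help="language of the commit message")
parser.add_argument("--provider", default="ollama", choices=["ollama", "g4f"],
                    help="AI provider to use")
parser.add_argument("--emoji", default=True, type=lambda v: v.lower() == "true",
                    help="include emojis (True/False)")
parser.add_argument("--model", default="llama3.1", help="AI model to use")

args = parser.parse_args()
```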
---
## 💡 Features in Detail
### Commit Message Review
Once a message is generated, you'll be prompted to:
- **Accept** the message (`y`).
- **Reject** the message and have a new one generated (`n`).
- **Cancel** the message (`c`).
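
As an illustration, such an accept/regenerate/cancel loop might look like the sketch below; `generate_message` stands in for whichever provider call is used, and this is not Commify's actual code.

```python
# Illustrative review loop -- Commify's real implementation may differ.
def review_loop(generate_message) -> str | None:
    while True:
        message = generate_message()
        print(f"\nProposed commit message:\n\n{message}\n")
        choice = input("Accept (y), regenerate (n), or cancel (c)? ").strip().lower()
        if choice == "y":
            return message   # caller commits with this message
        if choice == "c":
            return None      # abort without committing
        # anything else (including "n") loops back and generates a new message
```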
### Commify Providers
Commify currently supports only two providers:
- [ollama](https://ollama.com/): Ollama is an open-source project that serves as a powerful and user-friendly platform for running LLMs on your local machine.
- [gpt4free](https://github.com/xtekky/gpt4free): gpt4free is an AI software package that reverse-engineers APIs to give anyone free access to popular and powerful AI models.
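
Both providers expose a similar chat-style Python API. The snippet below shows roughly how each might be called, assuming the `ollama` and `g4f` packages are installed; it is an illustration, not Commify's actual provider code.

```python
# Rough comparison of the two providers' Python chat APIs (illustrative only).
def ask_ollama(prompt: str, model: str = "llama3.1") -> str:
    import ollama  # requires a running local Ollama server
    response = ollama.chat(model=model, messages=[{"role": "user", "content": prompt}])
    return response["message"]["content"]


def ask_g4f(prompt: str, model: str = "gpt-4o") -> str:
    from g4f.client import Client  # pip install g4f
    client = Client()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```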
Feel free to submit a pull request or open an issue to add more providers!
---
## 🧩 Testing Information
Confirmed successful runs (with no errors) on the following:
- **OS:**
  - Windows 11
  - Windows 10
  - Ubuntu 24.04.1 LTS
  - Linux Mint 22
- **Python:**
- Python 3.11.9
- Python 3.12.3
- **AI Models:**
- llama3.2-vision `Ollama`
- llama3.1 `Ollama`
- dolphin-llama3 `Ollama`
- gpt-4o `G4F`
- gpt-4o-mini `G4F`
Let us know if it runs on your machine too!
---
## 💻 Developer Information
Commify is developed and maintained by **Matuco19**.
- Matuco19 Website: [matuco19.com](https://matuco19.com)
- GitHub: [github.com/Matuco19](https://github.com/Matuco19)
- Discord Server: [discord.gg/Matuco19Server0](https://discord.gg/hp7yCxHJBw)
---
## 📑 License
~~![License-MATCO Open Source V1](https://img.shields.io/badge/License-MATCO_Open_Source_V1-blue.svg)~~
~~This project is open-source and available under the [MATCO-Open-Source License](https://matuco19.com/licenses/MATCO-Open-Source). See the `LICENSE` file for details.~~
This project is now licensed under the **Apache License 2.0**.
---
### 👋 Contributions
Contributions are welcome! Feel free to open an issue or submit a pull request on [GitHub](https://github.com/Matuco19/commify).
---
Start making commits with **Commify** today! 🎉