# GPTComet: AI-Powered Git Commit Message Generator And Reviewer
<a href="https://www.producthunt.com/posts/gptcomet?embed=true&utm_source=badge-featured&utm_medium=badge&utm_souce=badge-gptcomet" target="_blank"><img src="https://api.producthunt.com/widgets/embed-image/v1/featured.svg?post_id=774818&theme=light&t=1736583021458" alt="GPTComet - GPTComet: AI-Powered Git Commit Message Generator | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /></a>
[![PyPI version](https://img.shields.io/pypi/v/gptcomet?style=for-the-badge)](https://pypi.org/project/gptcomet/)
![GitHub Release](https://img.shields.io/github/v/release/belingud/gptcomet?style=for-the-badge)
[![License](https://img.shields.io/github/license/belingud/gptcomet.svg?style=for-the-badge)](https://opensource.org/licenses/MIT)
![GitHub go.mod Go version](https://img.shields.io/github/go-mod/go-version/belingud/gptcomet?style=for-the-badge)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/belingud/gptcomet/release.yml?style=for-the-badge)
![PyPI - Downloads](https://img.shields.io/pypi/dm/gptcomet?logo=pypi&style=for-the-badge)
![Pepy Total Downloads](https://img.shields.io/pepy/dt/gptcomet?style=for-the-badge&logo=python)
![GitHub Downloads (all assets, all releases)](https://img.shields.io/github/downloads/belingud/gptcomet/total?style=for-the-badge&label=Release%20Download)
<!-- TOC -->
- [GPTComet: AI-Powered Git Commit Message Generator And Reviewer](#gptcomet-ai-powered-git-commit-message-generator-and-reviewer)
- [💡 Overview](#-overview)
- [✨ Features](#-features)
- [⬇️ Installation](#-installation)
- [📕 Usage](#-usage)
- [🔧 Setup](#-setup)
- [Configuration Methods](#configuration-methods)
- [Provider Setup Guide](#provider-setup-guide)
- [OpenAI](#openai)
- [Gemini](#gemini)
- [Claude/Anthropic](#claudeanthropic)
- [Vertex](#vertex)
- [Azure](#azure)
- [Ollama](#ollama)
- [Other Supported Providers](#other-supported-providers)
- [Manual Provider Setup](#manual-provider-setup)
- [⌨️ Commands](#-commands)
- [⚙ Configuration](#%E2%9A%99-configuration)
- [file_ignore](#file_ignore)
- [provider](#provider)
- [output](#output)
- [Markdown theme](#markdown-theme)
- [Supported languages](#supported-languages)
- [console](#console)
- [🔦 Supported Keys](#-supported-keys)
- [📃 Example](#-example)
- [💻 Development](#-development)
- [📩 Contact](#-contact)
- [☕️ Sponsor](#%EF%B8%8F-sponsor)
- [📜 License](#-license)
<!-- /TOC -->
## 💡 Overview
GPTComet is a Go library and command-line tool designed to automate generating commit messages for Git repositories.
It leverages the power of AI to create meaningful commit messages based on the changes made in the codebase.
## ✨ Features
- **Automatic Commit Message Generation**: GPTComet can generate commit messages based on the changes made in the code.
- **Support for Multiple Languages**: GPTComet supports many output languages, including English, Chinese, and others (see [Supported languages](#supported-languages)).
- **Customizable Configuration**: GPTComet lets you customize the configuration to suit your needs, such as the LLM model and prompts.
- **Support for Rich Commit Messages**: GPTComet supports rich commit messages, which include a title, summary, and detailed description.
- **Support for Multiple Providers**: GPTComet supports multiple providers, including OpenAI, Gemini, Claude/Anthropic, Vertex, Azure, Ollama, and others.
- **Support SVN and Git**: GPTComet supports both SVN and Git repositories.
## ⬇️ Installation
To use GPTComet, download a binary from the [Github release](https://github.com/belingud/gptcomet/releases/latest) page, or use the install scripts:
```bash
curl -sSL https://cdn.jsdelivr.net/gh/belingud/gptcomet@master/install.sh | bash
```
Windows:
```powershell
irm https://cdn.jsdelivr.net/gh/belingud/gptcomet@master/install.ps1 | iex
```
If you prefer to run it from Python, you can install it with `pip` directly; the package already bundles the prebuilt binary for your platform.
```shell
pip install gptcomet
# Using pipx
pipx install gptcomet
# Using uv
uv tool install gptcomet
# Resolved 1 package in 1.33s
# Installed 1 package in 8ms
#  + gptcomet==0.1.6
# Installed 2 executables: gmsg, gptcomet
```
## 📕 Usage
To use gptcomet, follow these steps:
1. **Install GPTComet**: Install GPTComet from PyPI or download a release binary.
2. **Configure GPTComet**: See [Setup](#setup). Configure GPTComet with your api_key and other required keys like:
- `provider`: The provider of the language model (default `openai`).
- `api_base`: The base URL of the API (default `https://api.openai.com/v1`).
- `api_key`: The API key for the provider.
- `model`: The model used for generating commit messages (default `gpt-4o`).
3. **Run GPTComet**: Run GPTComet using the following command: `gmsg commit`.
If you are using the `openai` provider and have already set your `api_key`, you can run `gmsg commit` directly.
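For example, a minimal first run with the default `openai` provider (placeholder key shown) might look like this:

```shell
# Set your API key once; GPTComet stores it in its config file
gmsg config set openai.api_key YOUR_API_KEY

# Stage your changes, then generate and confirm the commit message
git add .
gmsg commit
```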
## 🔧 Setup
### Configuration Methods
1. **Direct Configuration**
- Configure directly in `~/.config/gptcomet/gptcomet.yaml`.
2. **Interactive Setup**
- Use the `gmsg newprovider` command for guided setup.
### Provider Setup Guide
![Made with VHS](https://vhs.charm.sh/vhs-6019QMIveifvh9vGKc2ZZ8.gif)
```bash
gmsg newprovider
Select Provider
> 1. azure
2. chatglm
3. claude
4. cohere
5. deepseek
6. gemini
7. groq
8. kimi
9. mistral
10. ollama
11. openai
12. openrouter
13. sambanova
14. silicon
15. tongyi
16. vertex
17. xai
18. Input Manually
↑/k up • ↓/j down • ? more
```
#### OpenAI
OpenAI API key page: https://platform.openai.com/api-keys
```shell
gmsg newprovider
Selected provider: openai
Configure provider:
Previous inputs:
Enter OpenAI API base: https://api.openai.com/v1
Enter API key: sk-abc*********************************************
Enter max tokens: 1024
Enter Enter model name (default: gpt-4o):
> gpt-4o
Provider openai configured successfully!
```
#### Gemini
Gemini API key page: https://aistudio.google.com/u/1/apikey
```shell
gmsg newprovider
Selected provider: gemini
Configure provider:
Previous inputs:
Enter Gemini API base: https://generativelanguage.googleapis.com/v1beta/models
Enter API key: AIz************************************
Enter max tokens: 1024
Enter Enter model name (default: gemini-1.5-flash):
> gemini-2.0-flash-exp
Provider gemini already has a configuration. Do you want to overwrite it? (y/N): y
Provider gemini configured successfully!
```
#### Claude/Anthropic
There is no recorded walkthrough for Anthropic yet (the author does not have an Anthropic account); get your API key from the [Anthropic console](https://console.anthropic.com) and configure the `claude` provider with `gmsg newprovider`.
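As a rough sketch, you should also be able to configure it manually with `gmsg config set`, following the generic `<provider>.<key>` pattern documented below; the values here are illustrative placeholders only:

```shell
# Illustrative only: keys follow the <provider>.<key> pattern used by other providers
gmsg config set provider claude
gmsg config set claude.api_key YOUR_ANTHROPIC_API_KEY
gmsg config set claude.model YOUR_CLAUDE_MODEL_NAME
gmsg config set claude.max_tokens 1024
```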
#### Vertex
Vertex console page: https://console.cloud.google.com
```shell
gmsg newprovider
Selected provider: vertex
Configure provider:
Previous inputs:
Enter Vertex AI API Base URL: https://us-central1-aiplatform.googleapis.com/v1
Enter API key: sk-awz*********************************************
Enter location (e.g., us-central1): us-central1
Enter max tokens: 1024
Enter model name: gemini-1.5-pro
Enter Enter Google Cloud project ID:
> test-project
Provider vertex configured successfully!
```
#### Azure
```shell
gmsg newprovider
Selected provider: azure
Configure provider:
Previous inputs:
Enter Azure OpenAI endpoint: https://gptcomet.openai.azure.com
Enter API key: ********************************
Enter API version: 2024-02-15-preview
Enter Azure OpenAI deployment name: gpt4o
Enter max tokens: 1024
Enter Enter deployment name (default: gpt-4o):
> gpt-4o
Provider azure configured successfully!
```
#### Ollama
```shell
gmsg newprovider
Selected provider: ollama
Configure provider:
Previous inputs:
Enter Ollama API Base URL: http://localhost:11434/api
Enter max tokens: 1024
Enter Enter model name (default: llama2):
> llama2
Provider ollama configured successfully!
```
#### Other Supported Providers
- Groq
- Mistral
- Tongyi/Qwen
- XAI
- Sambanova
- Silicon
- Deepseek
- ChatGLM
- KIMI
- Cohere
- OpenRouter
Not supported:
- Baidu ERNIE
- Tencent Hunyuan
### Manual Provider Setup
Alternatively, you can enter a provider name manually and set up its configuration yourself.
```shell
gmsg newprovider
You can either select one from the list or enter a custom provider name.
...
vertex
> Input manually
Enter provider name: test
Enter OpenAI API Base URL [https://api.openai.com/v1]:
Enter model name [gpt-4o]:
Enter API key: ************************************
Enter max tokens [1024]:
[GPTComet] Provider test configured successfully.
```
Some providers need extra custom configuration, such as `cloudflare`.
> Be aware that the model name is not used by the Cloudflare API; the model is selected through `completion_path` instead, as shown below.
```shell
$ gmsg newprovider
Selected provider: cloudflare
Configure provider:
Previous inputs:
Enter API Base URL: https://api.cloudflare.com/client/v4/accounts/<account_id>/ai/run
Enter model name: llama-3.3-70b-instruct-fp8-fast
Enter API key: abc*************************************
Enter Enter max tokens (default: 1024):
> 1024
Provider cloudflare already has a configuration. Do you want to overwrite it? (y/N): y
Provider cloudflare configured successfully!
$ gmsg config set cloudflare.completion_path @cf/meta/llama-3.3-70b-instruct-fp8-fast
$ gmsg config set cloudflare.answer_path result.response
```
## ⌨️ Commands
The following are the available commands for GPTComet:
- `gmsg config`: Config manage commands group.
- `get <key>`: Get the value of a configuration key.
- `list`: List the entire configuration content.
- `reset`: Reset the configuration to default values (optionally reset only the prompt section with `--prompt`).
- `set <key> <value>`: Set a configuration value.
- `path`: Get the configuration file path.
  - `remove <key> [value]`: Remove a configuration key, or remove a value from a list key (list keys only, e.g. `file_ignore`).
  - `append <key> <value>`: Append a value to a list key (list keys only, e.g. `file_ignore`).
- `keys`: List all supported configuration keys.
- `gmsg commit`: Generate a commit message from the changes/diff.
  - `--svn`: Generate a commit message for an SVN repository.
  - `--dry-run`: Perform a dry run without actually generating the commit message.
- `-y/--yes`: Skip the confirmation prompt.
- `gmsg newprovider`: Add a new provider.
- `gmsg review`: Review the staged diff, or pipe a diff into `gmsg review`.
  - `--svn`: Get the diff from SVN.
Global flags:
```shell
-c, --config string Config file path
-d, --debug Enable debug mode
```
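A few typical invocations combining these commands and flags:

```shell
# Generate a message from the staged diff and skip the confirmation prompt
gmsg commit -y

# Dry run: nothing is committed
gmsg commit --dry-run

# Review the staged diff, or pipe an arbitrary diff in
gmsg review
git diff | gmsg review

# Use a specific config file with debug output enabled
gmsg commit -c /path/to/gptcomet.yaml -d
```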
## ⚙ Configuration
Here's a summary of the main configuration keys:
| Key | Description | Default Value |
| :----------------------------- | :--------------------------------------------------------- | :-------------------------------- |
| `provider` | The name of the LLM provider to use. | `openai` |
| `file_ignore` | A list of file patterns to ignore in the diff. | (See [file_ignore](#file_ignore)) |
| `output.lang` | The language for commit message generation. | `en` |
| `output.rich_template` | The template to use for rich commit messages. | `<title>:<summary>\n\n<detail>` |
| `output.translate_title` | Translate the title of the commit message. | `false` |
| `output.review_lang` | The language to generate the review message. | `en` |
| `output.markdown_theme`        | The theme used when rendering markdown output.              | `auto`                            |
| `console.verbose` | Enable verbose output. | `true` |
| `<provider>.api_base` | The API base URL for the provider. | (Provider-specific) |
| `<provider>.api_key` | The API key for the provider. | |
| `<provider>.model` | The model name to use. | (Provider-specific) |
| `<provider>.retries` | The number of retry attempts for API requests. | `2` |
| `<provider>.proxy` | The proxy URL to use (if needed). | |
| `<provider>.max_tokens` | The maximum number of tokens to generate. | `2048` |
| `<provider>.top_p` | The top-p value for nucleus sampling. | `0.7` |
| `<provider>.temperature` | The temperature value for controlling randomness. | `0.7` |
| `<provider>.frequency_penalty` | The frequency penalty value. | `0` |
| `<provider>.extra_headers` | Extra headers to include in API requests (JSON string). | `{}` |
| `<provider>.completion_path` | The API path for completion requests. | (Provider-specific) |
| `<provider>.answer_path` | The JSON path to extract the answer from the API response. | (Provider-specific) |
| `prompt.brief_commit_message` | The prompt template for generating brief commit messages. | (See `defaults/defaults.go`) |
| `prompt.rich_commit_message` | The prompt template for generating rich commit messages. | (See `defaults/defaults.go`) |
| `prompt.translation` | The prompt template for translating commit messages. | (See `defaults/defaults.go`) |
**Note:** `<provider>` should be replaced with the actual provider name (e.g., `openai`, `gemini`, `claude`).
Some providers require additional keys; for example, Vertex needs a project ID, location, etc.
The GPTComet configuration file is `gptcomet.yaml` and uses the keys listed above.
`output.translate_title` controls whether the title of the commit message is translated.
For example, with `output.lang: zh-cn` and a commit title of `feat: Add new feature`:
If `output.translate_title` is set to `true`, the title is translated to `功能:新增功能`.
Otherwise, it is rendered as `feat: 新增功能`.
In some cases you can set `completion_path` to an empty string, e.g. `<provider>.completion_path: ""`, to use the `api_base` endpoint directly.
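For instance, to reproduce the example above (Chinese messages with an untranslated conventional-commit prefix), and to point a provider directly at its `api_base`, you could run the following (the `openai` provider here is just an illustration):

```shell
# Generate commit messages in Simplified Chinese, keeping the "feat:" prefix in English
gmsg config set output.lang zh-cn
gmsg config set output.translate_title false

# Use the api_base endpoint directly by clearing the provider's completion_path
gmsg config set openai.completion_path ""
```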
### file_ignore
The file patterns to ignore when generating a commit message. The default value is:
```yaml
- bun.lockb
- Cargo.lock
- composer.lock
- Gemfile.lock
- package-lock.json
- pnpm-lock.yaml
- poetry.lock
- yarn.lock
- pdm.lock
- Pipfile.lock
- "*.py[cod]"
- go.sum
- uv.lock
```
You can add more ignore patterns with the `gmsg config append file_ignore <pattern>` command.
`<pattern>` uses the same syntax as `.gitignore`, e.g. `*.so` to ignore all files with the `.so` suffix.
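For example, to ignore compiled shared objects in addition to the defaults, and remove the pattern again later:

```shell
# Append a gitignore-style pattern to the ignore list
gmsg config append file_ignore "*.so"

# Remove it again if you change your mind
gmsg config remove file_ignore "*.so"
```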
### provider
The provider configuration of the language model.
The default provider is `openai`.
A provider config looks like this:
```yaml
provider: openai
openai:
api_base: https://api.openai.com/v1
api_key: YOUR_API_KEY
model: gpt-4o
retries: 2
max_tokens: 1024
temperature: 0.7
top_p: 0.7
frequency_penalty: 0
extra_headers: {}
answer_path: choices.0.message.content
completion_path: /chat/completions
```
If you are using `openai`, leave `api_base` at its default and just set your `api_key` in the config.
If you are using an OpenAI-compatible provider (one exposing the same API), you can keep the provider set to `openai`
and point it at your custom `api_base`, `api_key`, and `model`.
For example:
OpenRouter provides an API compatible with OpenAI,
so you can set the provider to `openai`, set `api_base` to `https://openrouter.ai/api/v1`,
`api_key` to your key from the [keys page](https://openrouter.ai/settings/keys),
and `model` to `meta-llama/llama-3.1-8b-instruct:free` or another model you prefer.
```shell
gmsg config set openai.api_base https://openrouter.ai/api/v1
gmsg config set openai.api_key YOUR_API_KEY
gmsg config set openai.model meta-llama/llama-3.1-8b-instruct:free
gmsg config set openai.max_tokens 1024
```
Silicon (SiliconFlow) provides a similar interface to OpenRouter, so you can set the provider to `openai`
and set `api_base` to `https://api.siliconflow.cn/v1`.
**Note that the maximum allowed tokens varies between models, and the API will return an error if `max_tokens` is too large.**
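A sketch of pointing the `openai` provider at SiliconFlow; the key and model name are placeholders, so substitute a model from their catalog:

```shell
gmsg config set openai.api_base https://api.siliconflow.cn/v1
gmsg config set openai.api_key YOUR_SILICONFLOW_API_KEY
gmsg config set openai.model YOUR_MODEL_NAME
gmsg config set openai.max_tokens 1024
```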
### output
The output configuration of the commit message.
The default output is
```yaml
output:
lang: en
rich_template: "<title>:<summary>\n\n<detail>"
translate_title: false
review_lang: "en"
markdown_theme: "auto"
```
You can set `rich_template` to change the template of the rich commit message,
and set `lang` to change the language of the commit message.
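For example, from the command line (the template value is only an illustration; how escape sequences like `\n` are stored may depend on your shell, so editing `gptcomet.yaml` directly can be safer for multi-line templates):

```shell
# Adjust the rich commit message layout
gmsg config set output.rich_template "<title>: <summary>\n\n<detail>"

# Generate review output in another language
gmsg config set output.review_lang fr
```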
### Markdown theme
Supported markdown themes:
- `auto`: Auto detect markdown theme (default).
- `ascii`: ASCII style.
- `dark`: Dark theme.
- `dracula`: Dracula theme.
- `light`: Light theme.
- `tokyo-night`: Tokyo Night theme.
- `notty`: Plain text, no rendering.
- `pink`: Pink theme.
If you do not set `markdown_theme`, the theme is auto-detected.
On a light terminal the theme will be `dark`, and on a dark terminal it will be `light`.
GPTComet uses [glamour](https://github.com/charmbracelet/glamour) to render markdown; you can preview the themes in the [glamour style gallery](https://github.com/charmbracelet/glamour/tree/master/styles/gallery#glamour-style-section).
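To force a specific theme instead of auto-detection:

```shell
gmsg config set output.markdown_theme dracula
```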
### Supported languages
`output.lang` and `output.review_lang` support the following languages:
- `en`: English
- `zh-cn`: Simplified Chinese
- `zh-tw`: Traditional Chinese
- `fr`: French
- `vi`: Vietnamese
- `ja`: Japanese
- `ko`: Korean
- `ru`: Russian
- `tr`: Turkish
- `id`: Indonesian
- `th`: Thai
- `de`: German
- `es`: Spanish
- `pt`: Portuguese
- `it`: Italian
- `ar`: Arabic
- `hi`: Hindi
- `el`: Greek
- `pl`: Polish
- `nl`: Dutch
- `sv`: Swedish
- `fi`: Finnish
- `hu`: Hungarian
- `cs`: Czech
- `ro`: Romanian
- `bg`: Bulgarian
- `uk`: Ukrainian
- `he`: Hebrew
- `lt`: Lithuanian
- `la`: Latin
- `ca`: Catalan
- `sr`: Serbian
- `sl`: Slovenian
- `mk`: Macedonian
- `lv`: Latvian
### console
The console output config.
The default console config is:
```yaml
console:
verbose: true
```
When `verbose` is true, more information will be printed in the console.
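To quiet the output:

```shell
gmsg config set console.verbose false
```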
## 🔦 Supported Keys
You can use `gmsg config keys` to check supported keys.
## 📃 Example
Here is an example of how to use GPTComet:
1. When you first set your OpenAI key with `gmsg config set openai.api_key YOUR_API_KEY`, a config file is generated at `~/.local/gptcomet/gptcomet.yaml` containing:
```yaml
provider: "openai"
openai:
api_base: "https://api.openai.com/v1"
api_key: "YOUR_API_KEY"
model: "gpt-4o"
retries: 2
output:
lang: "en"
```
2. Run the following command to generate a commit message: `gmsg commit`
3. GPTComet will generate a commit message based on the changes made in the code and display it in the console.
Note: Replace `YOUR_API_KEY` with your actual API key for the provider.
## 💻 Development
If you'd like to contribute to GPTComet, feel free to fork this project and submit a pull request.
First, fork the project and clone your repo.
```shell
git clone https://github.com/<yourname>/gptcomet
```
Second, make sure you have `uv`; you can install it with `pip`, `brew`, or another method from its [installation](https://docs.astral.sh/uv/getting-started/installation/) docs.
Use the `just` command to install dependencies. `just` is a handy way to save and run project-specific commands; see the [`just` docs](https://github.com/casey/just).
```shell
just install
```
## 📩 Contact
If you have any questions or suggestions, feel free to open an issue or get in touch.
## ☕️ Sponsor
If you like GPTComet, you can buy me a coffee to support the project. Any support helps it go further.
[Buy Me A Coffee](./SPONSOR.md)
## 📜 License
GPTComet is licensed under the MIT License.
[![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fbelingud%2Fgptcomet.svg?type=large&issueType=license)](https://app.fossa.com/projects/git%2Bgithub.com%2Fbelingud%2Fgptcomet?ref=badge_large&issueType=license)