gpt-po-translator

Name: gpt-po-translator
Version: 1.3.0
Summary: A CLI tool for translating .po files using GPT models.
Author email: Bram Mittendorff <bram@pescheck.io>
Requires Python: >=3.9
License: MIT
Upload time: 2025-10-08 14:40:36
            # GPT-PO Translator

[![Python Package CI](https://github.com/pescheckit/python-gpt-po/actions/workflows/ci-cd.yml/badge.svg)](https://github.com/pescheckit/python-gpt-po/actions/workflows/ci-cd.yml)
![PyPI](https://img.shields.io/pypi/v/gpt-po-translator?label=gpt-po-translator)
![Downloads](https://pepy.tech/badge/gpt-po-translator)

**Translate gettext (.po) files using AI models.** Supports OpenAI, Azure OpenAI, Anthropic/Claude, DeepSeek, and Ollama (local) with automatic AI translation tagging.

## 🚀 Quick Start

```bash
# Install
pip install gpt-po-translator

# Set API key
export OPENAI_API_KEY='your_api_key_here'

# Auto-detect and translate all languages
gpt-po-translator --folder ./locales --bulk
```

## ✨ Key Features

- **Multiple AI providers** - OpenAI, Azure OpenAI, Anthropic/Claude, DeepSeek, and Ollama (local)
- **Privacy option** - Use Ollama for local, offline translation with no cloud API calls
- **AI translation tracking** - Auto-tags AI-generated translations with `#. AI-generated` comments
- **Bulk processing** - Efficient batch translation for large files
- **Smart language detection** - Auto-detects target languages from the folder structure (see the layout sketch after this list)
- **Fuzzy entry handling** - Translates and fixes fuzzy entries properly
- **Docker ready** - Available as a container image for easy deployment
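
As a concrete illustration of the folder-based language detection above, here is a minimal sketch assuming the conventional gettext/Django layout (directory and file names are illustrative):

```bash
# Illustrative layout — language codes are read from the per-language directories:
#   locales/
#   ├── de/LC_MESSAGES/django.po
#   ├── fr/LC_MESSAGES/django.po
#   └── ar/LC_MESSAGES/django.po

# With such a layout, auto-detection covers every language in one run:
gpt-po-translator --folder ./locales --bulk
```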

## 📦 Installation

### PyPI (Recommended)
```bash
pip install gpt-po-translator
```

### Docker
```bash
docker pull ghcr.io/pescheckit/python-gpt-po:latest
```

### Manual
```bash
git clone https://github.com/pescheckit/python-gpt-po.git
cd python-gpt-po
pip install -e .
```

## 🔧 Setup

### API Keys (Cloud Providers)

Choose your AI provider and set the corresponding API key:

```bash
# OpenAI
export OPENAI_API_KEY='your_key'

# Anthropic/Claude
export ANTHROPIC_API_KEY='your_key'

# DeepSeek
export DEEPSEEK_API_KEY='your_key'

# Azure OpenAI
export AZURE_OPENAI_API_KEY='your_key'
export AZURE_OPENAI_ENDPOINT='https://your-resource.openai.azure.com/'
export AZURE_OPENAI_API_VERSION='2024-02-01'
```

### Or Use Ollama (Local, No API Key Needed)

```bash
# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# 2. Pull a model
ollama pull qwen2.5    # Best for multilingual (Arabic, Chinese, etc.)
# OR
ollama pull llama3.2   # Fast for European languages

# 3. Translate (no API key required!)
gpt-po-translator --provider ollama --folder ./locales

# For non-Latin scripts, use qwen2.5 WITHOUT --bulk
gpt-po-translator --provider ollama --model qwen2.5 --folder ./locales --lang ar
```

> **💡 Important:** For Ollama with **non-Latin languages** (Arabic, Chinese, Japanese, etc.), **omit the `--bulk` flag**. Single-item translation is more reliable because the model doesn't have to format responses as JSON.
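
If your Ollama server runs on another machine or needs more time for large files, the server-related flags from the command reference can point at it; a small sketch (host, timeout, and model values are illustrative):

```bash
# Illustrative: Ollama on another host, with a longer timeout for large catalogs
gpt-po-translator --provider ollama \
  --model qwen2.5 \
  --ollama-base-url http://192.168.1.50:11434 \
  --ollama-timeout 300 \
  --folder ./locales --lang ar
```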

## 💡 Usage Examples

### Basic Translation
```bash
# Auto-detect languages from PO files (recommended)
gpt-po-translator --folder ./locales --bulk -v

# Or specify languages explicitly
gpt-po-translator --folder ./locales --lang de,fr,es --bulk -v

# Single language with progress information
gpt-po-translator --folder ./locales --lang de -v
```

### Different AI Providers
```bash
# Use Claude (Anthropic) - auto-detect languages
gpt-po-translator --provider anthropic --folder ./locales --bulk

# Use DeepSeek with specific languages
gpt-po-translator --provider deepseek --folder ./locales --lang de

# Use Azure OpenAI with auto-detection
gpt-po-translator --provider azure_openai --folder ./locales --bulk

# Use Ollama (local, private, free) - omit --bulk for non-Latin scripts
gpt-po-translator --provider ollama --folder ./locales
```

### Docker Usage
```bash
# Basic usage with OpenAI
docker run -v $(pwd):/data \
  -e OPENAI_API_KEY="your_key" \
  ghcr.io/pescheckit/python-gpt-po:latest \
  --folder /data --bulk

# With Ollama (local, no API key needed)
# Note: Omit --bulk for better quality with non-Latin scripts
docker run --rm \
  -v $(pwd):/data \
  --network host \
  ghcr.io/pescheckit/python-gpt-po:latest \
  --provider ollama \
  --folder /data

# With Azure OpenAI
docker run -v $(pwd):/data \
  -e AZURE_OPENAI_API_KEY="your_key" \
  -e AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/" \
  -e AZURE_OPENAI_API_VERSION="2024-02-01" \
  ghcr.io/pescheckit/python-gpt-po:latest \
  --provider azure_openai --folder /data --lang de
```

## 🏷️ AI Translation Tracking

**All AI translations are automatically tagged** for transparency and compliance:

```po
#. AI-generated
msgid "Hello"
msgstr "Hallo"
```

This helps you:
- Track which translations are AI-generated vs. human-written
- Comply with AI content disclosure requirements
- Manage incremental translation workflows

**Note:** Django's `makemessages` removes these comments but preserves translations. Re-run the translator after `makemessages` to restore tags.
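
To check how many entries in a catalog carry the tag, a plain grep over the `.po` files is enough (paths below are illustrative):

```bash
# Count entries tagged as AI-generated in a single catalog
grep -c '^#\. AI-generated' locales/de/LC_MESSAGES/django.po

# List every catalog under ./locales that contains at least one tagged entry
grep -rl '^#\. AI-generated' --include='*.po' ./locales
```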

## 📚 Command Reference

| Option | Description |
|--------|-------------|
| `--folder` | Path to .po files |
| `--lang` | Target languages (e.g., `de,fr,es`, `fr_CA`, `pt_BR`) |
| `--provider` | AI provider: `openai`, `azure_openai`, `anthropic`, `deepseek`, `ollama` |
| `--bulk` | Enable batch translation (recommended for large files) |
| `--bulksize` | Entries per batch (default: 50) |
| `--model` | Specific model to use |
| `--list-models` | Show available models |
| `--fix-fuzzy` | Translate fuzzy entries |
| `--folder-language` | Auto-detect languages from folders |
| `--no-ai-comment` | Disable AI tagging |
| `--ollama-base-url` | Ollama server URL (default: `http://localhost:11434`) |
| `--ollama-timeout` | Ollama timeout in seconds (default: 120) |
| `-v, --verbose` | Show progress information (use `-vv` for debug) |
| `-q, --quiet` | Only show errors |
| `--version` | Show version and exit |
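
Several of these options can be combined in one run; a sketch (the model name and batch size below are illustrative):

```bash
# Re-translate fuzzy entries in larger batches with an explicitly chosen model,
# without adding AI-generated comments
gpt-po-translator --folder ./locales --lang de,fr \
  --provider openai --model gpt-4o-mini \
  --bulk --bulksize 100 \
  --fix-fuzzy --no-ai-comment -v

# See which models the selected provider exposes before picking one
gpt-po-translator --provider anthropic --list-models
```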

## 🛠️ Development

### Build Docker Locally
```bash
git clone https://github.com/pescheckit/python-gpt-po.git
cd python-gpt-po
docker build -t python-gpt-po .
```

### Run Tests
```bash
# Local
python -m pytest

# Docker
docker run --rm -v $(pwd):/app -w /app --entrypoint python python-gpt-po -m pytest -v
```

## 📋 Requirements

- Python 3.9+
- Dependencies: `polib`, `openai`, `anthropic`, `requests`, `tenacity`

## 📖 Documentation

- **[Advanced Usage Guide](docs/usage.md)** - Comprehensive options and mechanics
- **[Development Guide](docs/development.md)** - Contributing guidelines
- **[GitHub Issues](https://github.com/pescheckit/python-gpt-po/issues)** - Bug reports and feature requests

## 📄 License

MIT License - See [LICENSE](LICENSE) for details.

            
