# claif_gem - Gemini Provider for Claif
A Claif provider for Google Gemini with full OpenAI client API compatibility. This package wraps the `gemini-cli` command-line tool to provide a consistent interface following the `client.chat.completions.create()` pattern.
## Features
- **OpenAI Client API Compatible**: Use the familiar `client.chat.completions.create()` pattern
- **Full Type Safety**: Returns standard `ChatCompletion` and `ChatCompletionChunk` objects
- **Streaming Support**: Real-time streaming with proper chunk handling
- **Subprocess Management**: Reliable communication with Gemini CLI
- **Auto-approval Mode**: Streamlined workflows without interruptions
- **Cross-platform Support**: Works on Windows, macOS, and Linux
- **Fire-based CLI**: Rich terminal interface with multiple output formats
## Quickstart
```bash
# Install
pip install claif_gem
# Basic usage - OpenAI compatible
python -c "
from claif_gem import GeminiClient
client = GeminiClient()
response = client.chat.completions.create(
    messages=[{'role': 'user', 'content': 'Hello Gemini!'}],
    model='gemini-1.5-flash'
)
print(response.choices[0].message.content)
"
# CLI usage
claif-gem query "Explain quantum computing"
claif-gem chat --model gemini-1.5-pro
```
## What is claif_gem?
`claif_gem` is the Google Gemini provider for the Claif framework with full OpenAI client API compatibility. It wraps the [Gemini CLI](https://github.com/google-gemini/gemini-cli/) tool to integrate Google's powerful Gemini language models into the unified Claif ecosystem through subprocess management and clean message translation.
**Key Features:**
- **Subprocess-based integration** - Reliable communication with Gemini CLI
- **Auto-approve & yes-mode** - Streamlined workflows without interruptions
- **Cross-platform CLI discovery** - Works on Windows, macOS, and Linux
- **Async/await throughout** - Built on anyio for efficiency
- **Rich CLI interface** - Beautiful terminal output with Fire
- **Type-safe API** - Comprehensive type hints for IDE support
- **Robust error handling** - Timeout protection and graceful failures
## Installation
### Prerequisites
Install the Gemini CLI via npm:
```bash
npm install -g @google/gemini-cli
```
Or set the path to an existing installation:
```bash
export GEMINI_CLI_PATH=/path/to/gemini
```
### Basic Installation
```bash
# Core package only
pip install claif_gem
# With Claif framework
pip install claif claif_gem
# All Claif providers
pip install claif[all]
```
### Installing Gemini CLI with Claif
```bash
# Using Claif's installer (recommended)
pip install claif && claif install gemini
# Or using claif_gem's installer
python -m claif_gem.install
# Manual installation with bun (faster)
bun add -g @google/gemini-cli
```
### Development Installation
```bash
git clone https://github.com/twardoch/claif_gem.git
cd claif_gem
pip install -e ".[dev,test]"
```
## Usage
### Basic Usage (OpenAI-Compatible)
```python
from claif_gem import GeminiClient
# Initialize the client
client = GeminiClient(
    api_key="your-api-key",     # Optional, uses GEMINI_API_KEY env var
    cli_path="/path/to/gemini"  # Optional, auto-discovers
)
# Create a chat completion - exactly like OpenAI
response = client.chat.completions.create(
    model="gemini-1.5-flash",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Explain machine learning"}
    ],
    temperature=0.7,
    max_tokens=1000
)
# Access the response
print(response.choices[0].message.content)
print(f"Model: {response.model}")
print(f"Usage: {response.usage}")
```
### Streaming Responses
```python
from claif_gem import GeminiClient
client = GeminiClient()
# Stream responses in real-time
stream = client.chat.completions.create(
    model="gemini-1.5-pro",
    messages=[
        {"role": "user", "content": "Write a story about space exploration"}
    ],
    stream=True
)
# Process streaming chunks
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
## CLI Usage
```bash
# Basic query
claif-gem query "Explain machine learning"
# With specific model
claif-gem query "Write Python code for binary search" --model gemini-1.5-pro
# Interactive chat mode
claif-gem chat --model gemini-2.0-flash-exp
# With system prompt
claif-gem query "Translate to French" --system "You are a professional translator"
# Stream responses
claif-gem stream "Create a detailed tutorial on REST APIs"
# Health check
claif-gem health
# List models
claif-gem models
# Show configuration
claif-gem config show
```
### Advanced Options
```bash
# Control tool approval
claif-gem query "Process these files" --auto-approve # Auto-approve tool use
claif-gem query "Analyze code" --no-auto-approve # Manual approval
# Yes mode for all prompts
claif-gem query "Refactor this module" --yes-mode
# Verbose output for debugging
claif-gem query "Debug this error" --verbose
# Custom timeout
claif-gem query "Complex analysis" --timeout 300
# Show response metrics
claif-gem query "Quick question" --show-metrics
```
### Configuration Management
```bash
# Show current config
claif-gem config show
# Set values
claif-gem config set --default-model gemini-2.5-pro
claif-gem config set --auto-approve true
claif-gem config set --timeout 180
# Save configuration
claif-gem config save
```
## Python API Usage
### Basic Usage
```python
import asyncio
from claif_gem import query, GeminiOptions
async def main():
    # Simple query
    async for message in query("Hello, Gemini!"):
        print(message.content)

    # Query with options
    options = GeminiOptions(
        model="gemini-2.5-pro",
        temperature=0.7,
        system_prompt="You are a helpful coding assistant",
        auto_approve=True,
        yes_mode=True
    )

    async for message in query("Explain Python decorators", options):
        print(message.content)

asyncio.run(main())
```
### Direct Client Usage
```python
import asyncio

from claif_gem.client import GeminiClient
from claif_gem.types import GeminiOptions

async def use_client():
    client = GeminiClient()

    options = GeminiOptions(
        model="gemini-2.5-pro",
        verbose=True,
        max_context_length=16000
    )

    async for message in client.query("What is machine learning?", options):
        print(f"[{message.role}]: {message.content}")

asyncio.run(use_client())
```
### Transport Layer Access
```python
import asyncio

from claif_gem.transport import GeminiTransport
from claif_gem.types import GeminiOptions

async def direct_transport():
    transport = GeminiTransport()

    options = GeminiOptions(
        timeout=120,
        auto_approve=True
    )

    async for response in transport.send_query("Explain async programming", options):
        if hasattr(response, 'content'):
            print(response.content)

asyncio.run(direct_transport())
```
### Error Handling
```python
import asyncio

from claif.common import ProviderError, TransportError
from claif_gem import query, GeminiOptions

async def safe_query():
    try:
        options = GeminiOptions(timeout=60)
        async for message in query("Complex task", options):
            print(message.content)
    except TransportError as e:
        print(f"Transport error: {e}")
        # Retry with different settings
    except ProviderError as e:
        print(f"Gemini error: {e}")
    except Exception as e:
        print(f"Unexpected error: {e}")

asyncio.run(safe_query())
```
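The `# Retry with different settings` comment above can be made concrete with a small wrapper. This is a minimal sketch, not part of the package; the retry count and backoff values are arbitrary:

```python
import asyncio

from claif.common import TransportError
from claif_gem import query, GeminiOptions

async def query_with_retry(prompt: str, attempts: int = 3, base_delay: float = 2.0):
    """Retry the query on transport failures with simple exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            options = GeminiOptions(timeout=60)
            async for message in query(prompt, options):
                print(message.content)
            return
        except TransportError as e:
            if attempt == attempts:
                raise
            print(f"Transport error: {e}, retrying...")
            await asyncio.sleep(base_delay * 2 ** (attempt - 1))

asyncio.run(query_with_retry("Complex task"))
```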
### Using with Claif Framework
```python
import asyncio

from claif import query as claif_query, Provider, ClaifOptions

async def use_with_claif():
    options = ClaifOptions(
        provider=Provider.GEMINI,
        model="gemini-2.5-pro",
        temperature=0.5,
        system_prompt="You are a data science expert"
    )

    async for message in claif_query("Explain neural networks", options):
        print(message.content)

asyncio.run(use_with_claif())
```
## How It Works
### Architecture Overview
```
┌─────────────────────┐
│  User Application   │
├─────────────────────┤
│     Claif Core      │
├─────────────────────┤
│      claif_gem      │
│  ┌───────────────┐  │
│  │  __init__.py  │  │  ← Main entry point, Claif interface
│  ├───────────────┤  │
│  │    cli.py     │  │  ← Fire-based CLI commands
│  ├───────────────┤  │
│  │   client.py   │  │  ← Client orchestration
│  ├───────────────┤  │
│  │ transport.py  │  │  ← Subprocess management
│  ├───────────────┤  │
│  │   types.py    │  │  ← Type definitions
│  └───────────────┘  │
├─────────────────────┤
│  Subprocess Layer   │
├─────────────────────┤
│ Gemini CLI Binary   │  ← External Node.js CLI
└─────────────────────┘
```
### Core Components
#### Main Module (`__init__.py`)
Entry point providing the `query()` function:
```python
async def query(
    prompt: str,
    options: ClaifOptions | None = None
) -> AsyncIterator[Message]:
    """Query Gemini with Claif-compatible interface."""
    # Convert options
    gemini_options = _convert_options(options) if options else GeminiOptions()

    # Delegate to client
    async for message in _client.query(prompt, gemini_options):
        yield message
```
Features:
- Thin wrapper design
- Option conversion between Claif and Gemini formats (sketched below)
- Module-level client instance
- Clean async generator interface
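A minimal sketch of what that option conversion might look like; the field names read off `ClaifOptions` are assumptions, not quoted from the claif package:

```python
from claif import ClaifOptions            # as imported in the Claif framework example
from claif_gem.types import GeminiOptions

def _convert_options(options: ClaifOptions) -> GeminiOptions:
    """Map generic Claif options onto Gemini-specific ones (illustrative only)."""
    return GeminiOptions(
        model=options.model,
        temperature=options.temperature,
        system_prompt=options.system_prompt,
        timeout=options.timeout,
        verbose=options.verbose,
    )
```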
#### CLI Module (`cli.py`)
Fire-based command-line interface:
```python
class GeminiCLI:
    def query(self, prompt: str, **kwargs):
        """Execute a query to Gemini."""

    def stream(self, prompt: str, **kwargs):
        """Stream responses in real-time."""

    def health(self):
        """Check Gemini CLI availability."""

    def config(self, action: str = "show", **kwargs):
        """Manage configuration."""
```
Features:
- Rich console output with progress indicators
- Response formatting and metrics
- Async execution with error handling
- Configuration persistence
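Fire turns the methods of `GeminiCLI` into subcommands (`query`, `stream`, `health`, `config`). A minimal entry point might look like the following sketch; the import path for the class is an assumption:

```python
import fire

from claif_gem.cli import GeminiCLI  # assumed import path for the class above

def main() -> None:
    """Expose GeminiCLI methods as `claif-gem <command>` subcommands."""
    fire.Fire(GeminiCLI)

if __name__ == "__main__":
    main()
```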
#### Client Module (`client.py`)
Manages the query lifecycle:
```python
class GeminiClient:
    def __init__(self):
        self.transport = GeminiTransport()

    async def query(self, prompt: str, options: GeminiOptions):
        # Send query via transport
        async for gemini_msg in self.transport.send_query(prompt, options):
            # Convert to Claif format
            yield self._convert_message(gemini_msg)
```
Features:
- Transport lifecycle management
- Message format conversion
- Error propagation
- Clean separation of concerns
#### Transport Module (`transport.py`)
Handles subprocess communication:
```python
class GeminiTransport:
    async def send_query(self, prompt: str, options: GeminiOptions):
        # Find CLI
        cli_path = self._find_cli_path()

        # Build command
        cmd = self._build_command(cli_path, prompt, options)

        # Execute and stream
        async with await anyio.open_process(cmd) as proc:
            async for line in proc.stdout:
                yield self._parse_line(line)
```
Key methods:
- `_find_cli_path()` - Multi-location CLI discovery
- `_build_command()` - Safe argument construction
- `_parse_output_line()` - JSON and plain text parsing (sketched after this list)
- Timeout management with process cleanup
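The JSON-first parsing with a plain-text fallback could be as small as the sketch below; the real `_parse_output_line()` likely preserves more metadata:

```python
import json

from claif_gem.types import GeminiMessage

def parse_output_line(line: str) -> GeminiMessage:
    """Interpret one stdout line: structured JSON if possible, plain text otherwise."""
    try:
        data = json.loads(line)
        return GeminiMessage(
            role=data.get("role", "assistant"),
            content=data.get("content", ""),
            metadata=data,
        )
    except json.JSONDecodeError:
        return GeminiMessage(role="assistant", content=line.rstrip("\n"))
```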
#### Types Module (`types.py`)
Type definitions and conversions:
```python
@dataclass
class GeminiOptions:
    model: str | None = None
    temperature: float | None = None
    system_prompt: str | None = None
    auto_approve: bool = True
    yes_mode: bool = True
    max_context_length: int | None = None
    timeout: int | None = None
    verbose: bool = False

@dataclass
class GeminiMessage:
    role: str
    content: str
    metadata: dict[str, Any] | None = None

    def to_claif_message(self) -> Message:
        """Convert to Claif format."""
```
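The `to_claif_message()` body is elided above. Assuming the Claif `Message` type lives in `claif.common` and takes a role and content, the conversion is a one-liner:

```python
from claif.common import Message  # assumed import path for the Claif Message type

# Method body inside GeminiMessage (sketch):
def to_claif_message(self) -> Message:
    """Convert to Claif format, dropping Gemini-specific metadata."""
    return Message(role=self.role, content=self.content)
```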
### Message Flow
1. **User Input** → CLI command or API call
2. **Option Translation** → ClaifOptions → GeminiOptions
3. **Client Processing** → GeminiClient prepares query
4. **Transport Execution**:
   - Find Gemini CLI binary
   - Build command with arguments
   - Spawn subprocess with anyio
   - Read stdout/stderr streams
5. **Response Parsing**:
   - Try JSON parsing first
   - Fall back to plain text
   - Convert to GeminiMessage
6. **Message Conversion** → GeminiMessage → Claif Message
7. **Async Yield** → Messages yielded to caller
### CLI Discovery
The transport searches for the Gemini CLI in this order (a sketch follows the list):
1. `GEMINI_CLI_PATH` environment variable
2. System PATH (`which gemini`)
3. Common installation paths:
   - `~/.local/bin/gemini`
   - `/usr/local/bin/gemini`
   - `/opt/gemini/bin/gemini`
4. NPM global paths:
   - Windows: `%APPDATA%/npm/gemini.cmd`
   - Unix: `~/.npm-global/bin/gemini`
   - System: `/usr/local/lib/node_modules/.bin/gemini`
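That search order translates into a short helper. The following is an illustrative sketch rather than the package's actual `_find_cli_path()`:

```python
import os
import shutil
from pathlib import Path

def find_gemini_cli() -> str | None:
    """Return the first Gemini CLI binary found, following the order listed above."""
    # 1. Explicit override via environment variable
    if env_path := os.environ.get("GEMINI_CLI_PATH"):
        return env_path
    # 2. Anything already on PATH
    if on_path := shutil.which("gemini"):
        return on_path
    # 3. Common install locations, then npm global paths
    candidates = [
        Path.home() / ".local/bin/gemini",
        Path("/usr/local/bin/gemini"),
        Path("/opt/gemini/bin/gemini"),
        Path.home() / ".npm-global/bin/gemini",
        Path("/usr/local/lib/node_modules/.bin/gemini"),
    ]
    return next((str(p) for p in candidates if p.exists()), None)
```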
### Command Construction
The Gemini CLI is invoked with arguments based on options:
```bash
gemini \
  -m <model> \                # model selection
  -a \                        # auto-approve
  -y \                        # yes-mode
  -t <temp> \                 # temperature
  -s <prompt> \               # system prompt
  --max-context <length> \
  -p "user prompt"
```
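The same construction in Python terms, as a sketch of what `_build_command()` might do with the flags listed above:

```python
from claif_gem.types import GeminiOptions

def build_command(cli_path: str, prompt: str, options: GeminiOptions) -> list[str]:
    """Assemble the argv list handed to the subprocess (illustrative sketch)."""
    cmd = [cli_path]
    if options.model:
        cmd += ["-m", options.model]
    if options.auto_approve:
        cmd.append("-a")
    if options.yes_mode:
        cmd.append("-y")
    if options.temperature is not None:
        cmd += ["-t", str(options.temperature)]
    if options.system_prompt:
        cmd += ["-s", options.system_prompt]
    if options.max_context_length:
        cmd += ["--max-context", str(options.max_context_length)]
    cmd += ["-p", prompt]
    return cmd
```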
### Code Structure
```
claif_gem/
├── src/claif_gem/
│   ├── __init__.py        # Main entry point
│   ├── cli.py             # Fire CLI implementation
│   ├── client.py          # Client orchestration
│   ├── transport.py       # Subprocess management
│   ├── types.py           # Type definitions
│   └── install.py         # CLI installation helper
├── tests/
│   └── test_package.py    # Basic tests
├── pyproject.toml         # Package configuration
├── README.md              # This file
└── CLAUDE.md              # Development guide
```
### Configuration
Environment variables:
- `GEMINI_CLI_PATH` - Path to the Gemini CLI binary (example below)
- `GEMINI_SDK=1` - Set by transport to indicate SDK usage
- `CLAIF_PROVIDER=gemini` - Provider identification
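`GEMINI_CLI_PATH` can also be set from Python before any query spawns the CLI, for example:

```python
import os

from claif_gem import GeminiClient

# Point the transport at a specific Gemini CLI binary instead of relying on discovery.
os.environ["GEMINI_CLI_PATH"] = "/usr/local/bin/gemini"

client = GeminiClient()
```

Passing `cli_path=` to `GeminiClient` (shown in the Usage section) achieves the same thing for a single client instance.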
Config file (`~/.claif/config.json`):
```json
{
  "providers": {
    "gemini": {
      "model": "gemini-2.5-pro",
      "auto_approve": true,
      "yes_mode": true,
      "max_context_length": 32000,
      "timeout": 180
    }
  }
}
```
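Reading that provider block back from Python is a one-liner per field (a standalone sketch, not the package's own config loader):

```python
import json
from pathlib import Path

# Load ~/.claif/config.json and pull out the Gemini provider settings.
config = json.loads((Path.home() / ".claif" / "config.json").read_text())
gemini_cfg = config.get("providers", {}).get("gemini", {})
print(gemini_cfg.get("model"), gemini_cfg.get("timeout"))
```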
## Installation with Bun
For faster installation, use Bun:
```bash
# Install bun
curl -fsSL https://bun.sh/install | bash
# Install Gemini CLI
bun add -g @google/gemini-cli
# Or use Claif's bundled installer
pip install claif
claif install gemini # Uses bun internally
```
Benefits of Bun:
- Much faster package installs than npm
- Creates standalone executables
- No Node.js version conflicts
- Cross-platform compatibility
## Why Use claif_gem?
### 1. **Unified Interface**
- Access Gemini through standard Claif API
- Switch between providers with one parameter
- Consistent error handling across providers
### 2. **Cross-Platform**
- Automatic CLI discovery on all platforms
- Platform-specific path handling
- Works in diverse environments
### 3. **Developer Experience**
- Full type hints for IDE support
- Rich CLI with progress indicators
- Clean async/await patterns
- Comprehensive error messages
### 4. **Production Ready**
- Robust subprocess management
- Timeout protection
- Graceful error recovery
- Extensive logging
### 5. **Flexible Configuration**
- Environment variables
- Config files
- CLI arguments
- Sensible defaults
## API Compatibility
This package is fully compatible with the OpenAI Python client API:
```python
# You can use it as a drop-in replacement
from claif_gem import GeminiClient as OpenAI
client = OpenAI()
# Now use exactly like the OpenAI client
response = client.chat.completions.create(
    model="gemini-1.5-flash",
    messages=[{"role": "user", "content": "Hello!"}]
)
```
## Migration from Old Async API
If you were using the old async-based Claif API:
```python
# Old API (deprecated)
import asyncio
from claif_gem import query
async def old_way():
    async for message in query("Hello Gemini"):
        print(message.content)

# New API (OpenAI-compatible)
from claif_gem import GeminiClient

def new_way():
    client = GeminiClient()
    response = client.chat.completions.create(
        messages=[{"role": "user", "content": "Hello Gemini"}],
        model="gemini-1.5-flash"
    )
    print(response.choices[0].message.content)
```
### Key Changes
1. **Synchronous by default**: No more `async/await` for basic usage
2. **OpenAI-compatible structure**: `client.chat.completions.create()` pattern
3. **Standard message format**: `[{"role": "user", "content": "..."}]`
4. **Streaming support**: Use `stream=True` for real-time responses
5. **Type-safe responses**: Returns `ChatCompletion` objects from OpenAI types
## Best Practices
1. **Use auto-approve for trusted operations** - Speeds up workflows
2. **Set appropriate timeouts** - Prevent hanging on complex queries (see the example after this list)
3. **Enable verbose mode for debugging** - See full subprocess communication
4. **Use system prompts** - Set context for better responses
5. **Configure max context length** - Based on your use case
6. **Handle errors gracefully** - Implement retry logic
7. **Use streaming for long responses** - Better user experience
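As a combined illustration of practices 2-4 (bounded timeout, verbose debugging, and a system prompt) using the async API documented above:

```python
import asyncio

from claif_gem import query, GeminiOptions

async def main():
    options = GeminiOptions(
        timeout=120,                                         # practice 2: bounded timeout
        verbose=True,                                        # practice 3: debug subprocess chatter
        system_prompt="You are a concise technical writer",  # practice 4: set context
    )
    async for message in query("Draft a tutorial outline for REST APIs", options):
        print(message.content)

asyncio.run(main())
```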
## Contributing
See [CLAUDE.md](CLAUDE.md) for development guidelines.
### Development Setup
```bash
# Clone repository
git clone https://github.com/twardoch/claif_gem.git
cd claif_gem
# Install with dev dependencies
pip install -e ".[dev,test]"
# Run tests
pytest
# Format code
ruff format src/claif_gem tests
# Lint
ruff check src/claif_gem tests
# Type check
mypy src/claif_gem
```
### Testing
```bash
# Run all tests
pytest
# Run with coverage
pytest --cov=claif_gem --cov-report=html
# Run specific test
pytest tests/test_transport.py -v
# Test CLI commands
python -m claif_gem.cli health
python -m claif_gem.cli models
```
## License
MIT License - see [LICENSE](LICENSE) file for details.
Copyright (c) 2025 Adam Twardoch
## Links
### claif_gem Resources
- [GitHub Repository](https://github.com/twardoch/claif_gem) - Source code
- [PyPI Package](https://pypi.org/project/claif_gem/) - Latest release
- [Issue Tracker](https://github.com/twardoch/claif_gem/issues) - Bug reports
- [Discussions](https://github.com/twardoch/claif_gem/discussions) - Q&A
### Related Projects
**Claif Ecosystem:**
- [Claif](https://github.com/twardoch/claif) - Main framework
- [claif_cla](https://github.com/twardoch/claif_cla) - Claude provider
- [claif_cod](https://github.com/twardoch/claif_cod) - Codex provider
**Upstream Projects:**
- [Gemini CLI](https://github.com/google-gemini/gemini-cli/) - Google's CLI
- [Google AI Studio](https://ai.google.dev/) - Gemini documentation
- [Google AI Python SDK](https://github.com/google/generative-ai-python) - Python SDK
**Tools & Libraries:**
- [Fire](https://github.com/google/python-fire) - CLI framework
- [Rich](https://github.com/Textualize/rich) - Terminal formatting
- [anyio](https://github.com/agronholm/anyio) - Async compatibility
- [Bun](https://bun.sh) - Fast JavaScript runtime