# 🚀 Harvester SDK - Complete AI Processing Platform
> **"The unified interface for all AI providers with enterprise-grade reliability."**
[License](LICENSE) · [Python 3.8+](https://www.python.org/downloads/) · [Providers](#providers)
## 🌟 What is Harvester SDK?
Harvester SDK is a comprehensive AI processing platform that provides a **unified interface** to all major AI providers. Whether you need text generation, image creation, batch processing, or real-time conversations, Harvester SDK handles the complexity so you can focus on building.
### ⚡ Key Features
- **Multi-Provider Support** - OpenAI, Anthropic, Google AI Studio, Vertex AI, XAI, DeepSeek
- **Dual Authentication** - API keys (GenAI) and service accounts (Vertex AI)
- **Turn-Based Chat** - Non-streaming conversations with any model
- **Batch Processing** - Cost-effective bulk operations with 50% savings
- **Template System** - AI-powered code transformation
- **Image Generation** - DALL-E, Imagen, GPT Image support
- **Enterprise Ready** - Rate limiting, retries, error handling
## 🚀 Quick Start
### Installation
```bash
# Install the SDK
pip install harvester-sdk
# Install with all providers
pip install harvester-sdk[all]
# Install specific providers
pip install harvester-sdk[openai,anthropic,genai]
```
### Basic Usage
```bash
# Main CLI conductor
harvester --help
# Turn-based conversation (non-streaming)
harvester message --model gemini-2.5-flash
harvester message --model claude-sonnet-4-20250514 --system "You are a helpful assistant"
# Batch processing from CSV
harvester batch data.csv --model gpt-4o --template quick
# Process directory with templates
harvester process ./src --template refactor --model gemini-2.5-pro
# Generate images
harvester image "A beautiful sunset" --provider dalle3 --size 1024x1024
```
## 🔧 Provider Configuration
### Google AI Studio (GenAI) - API Key Authentication
```bash
export GEMINI_API_KEY=your_api_key
harvester message --model gemini-2.5-flash
```
### Google Vertex AI - Service Account Authentication
```bash
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
harvester message --model vtx-gemini-2.5-flash
```
### Other Providers
```bash
export OPENAI_API_KEY=your_openai_key
export ANTHROPIC_API_KEY=your_anthropic_key
export XAI_API_KEY=your_xai_key
export DEEPSEEK_API_KEY=your_deepseek_key
```
## 📋 Available Commands
### Core Commands
- `harvester message` - Turn-based conversations (non-streaming)
- `harvester chat` - Interactive streaming chat
- `harvester batch` - Batch process CSV files
- `harvester process` - Directory processing with templates
- `harvester image` - Image generation
- `harvester search` - AI-enhanced web search (Grok)
### Utility Commands
- `harvester list-models` - Show available models
- `harvester config --show` - Display configuration
- `harvester tier` - Show license information
- `harvester status` - Check batch job status
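
The same commands can be scripted. Below is a minimal sketch, assuming the `harvester` CLI is installed on your `PATH`; `data.csv` is a placeholder input file, and only flags documented above are used.

```python
import subprocess

def harvester(*args: str) -> str:
    """Run a harvester subcommand and return its stdout."""
    result = subprocess.run(
        ["harvester", *args],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

print(harvester("list-models"))                                   # show available models
print(harvester("batch", "data.csv", "--model", "gpt-4o", "--template", "quick"))
print(harvester("status"))                                        # check batch job status
```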
## 🎯 Model Selection Guide
### Google AI Models
| **API Key (GenAI)** | **Service Account (Vertex)** | **Use Case** |
|---------------------|------------------------------|--------------|
| `gemini-2.5-flash` | `vtx-gemini-2.5-flash` | Fast, cost-effective |
| `gemini-2.5-pro` | `vtx-gemini-2.5-pro` | High-quality reasoning |
| `gemini-1.5-flash` | `vtx-gemini-1.5-flash` | Legacy support |
### Other Providers
- **OpenAI**: `gpt-4o`, `gpt-4o-mini`
- **Anthropic**: `claude-sonnet-4-20250514`, `claude-opus-4-1-20250805`
- **XAI**: `grok-4-0709`, `grok-3`, `grok-3-mini`
- **DeepSeek**: `deepseek-chat`, `deepseek-reasoner`
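
To see how these model IDs plug into the SDK, here is a small sketch, assuming `quick_process` accepts any ID listed above; the prompt text is arbitrary.

```python
import asyncio

from harvester_sdk import HarvesterSDK

async def compare_models() -> None:
    sdk = HarvesterSDK()
    prompt = "Summarise the trade-offs of batch processing in two sentences."
    # Any ID from the tables above should work; the vtx- prefix routes the
    # request through Vertex AI instead of Google AI Studio.
    for model in ("gemini-2.5-flash", "vtx-gemini-2.5-pro", "gpt-4o-mini", "deepseek-chat"):
        result = await sdk.quick_process(prompt=prompt, model=model)
        print(f"{model}: {str(result)[:80]}")

asyncio.run(compare_models())
```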
## 💼 Programming Interface
### Python SDK Usage
```python
import asyncio

from harvester_sdk import HarvesterSDK

async def main():
    # Initialize SDK
    sdk = HarvesterSDK()

    # Quick processing
    result = await sdk.quick_process(
        prompt="Explain quantum computing",
        model="gemini-2.5-pro"
    )

    # Batch processing
    results = await sdk.process_batch(
        requests=["What is AI?", "Explain ML", "Define neural networks"],
        model="claude-sonnet-4-20250514"
    )

    # Multi-provider council (get consensus)
    consensus = await sdk.quick_council(
        prompt="What is consciousness?",
        models=["gemini-2.5-pro", "claude-sonnet-4-20250514", "gpt-4o"]
    )

asyncio.run(main())
```
### Provider Factory
```python
from providers.provider_factory import ProviderFactory

# Create provider factory
factory = ProviderFactory()

# Get the provider for a specific model
genai_provider = factory.get_provider("gemini-2.5-flash")       # -> GenAI provider
vertex_provider = factory.get_provider("vtx-gemini-2.5-flash")  # -> Vertex AI provider

# Generate a completion (await this inside an async function)
response = await genai_provider.complete("Hello, world!", "gemini-2.5-flash")
```
## 🏗️ Architecture
```
┌─────────────────────────────────────────────────────────┐
│                      HARVESTER SDK                      │
├─────────────────────────────────────────────────────────┤
│                   Main CLI Conductor                    │
│                   (harvester command)                   │
├──────────┬──────────┬──────────┬──────────┬─────────────┤
│ Message  │  Batch   │ Process  │  Image   │   Search    │
│(Non-str) │   CSV    │   Dir    │   Gen    │  Enhanced   │
├──────────┴──────────┴──────────┴──────────┴─────────────┤
│                    Provider Factory                     │
├──────────┬──────────┬──────────┬──────────┬─────────────┤
│  GenAI   │  Vertex  │  OpenAI  │Anthropic │     XAI     │
│(API Key) │(Service) │          │          │  DeepSeek   │
└──────────┴──────────┴──────────┴──────────┴─────────────┘
```
## 🔒 Authentication Methods
### Clear Separation for Google Services
**Google AI Studio (GenAI)**:
- ✅ Simple API key: `GEMINI_API_KEY`
- ✅ Models: `gemini-2.5-flash`, `gemini-2.5-pro`
- ✅ Best for: Personal use, quick setup

**Google Vertex AI**:
- ✅ Service account: `GOOGLE_APPLICATION_CREDENTIALS`
- ✅ Models: `vtx-gemini-2.5-flash`, `vtx-gemini-2.5-pro`
- ✅ Best for: Enterprise, GCP integration
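
The same separation applies in code: the model prefix decides which credentials are read. A minimal sketch, assuming the environment variables above are already exported:

```python
from providers.provider_factory import ProviderFactory

factory = ProviderFactory()

# Plain Gemini IDs resolve to the AI Studio provider (GEMINI_API_KEY),
# while the vtx- prefix resolves to the Vertex AI provider
# (GOOGLE_APPLICATION_CREDENTIALS).
studio_provider = factory.get_provider("gemini-2.5-flash")
vertex_provider = factory.get_provider("vtx-gemini-2.5-flash")
```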
## 💰 License Tiers
| Tier | Features | Max Workers | Providers |
|------|----------|-------------|-----------|
| **Freemium** | Basic tools | 5 | Limited |
| **Professional** | Full access | 20 | All |
| **Premium** | Advanced features | 75 | All + Priority |
| **Enterprise** | Unlimited | ∞ | All + SLA |
Check your tier: `harvester tier`
## 📖 Examples
### Turn-Based Conversation
```bash
# Start a conversation with Gemini
harvester message --model gemini-2.5-flash --save
# Chat with Claude
harvester message --model claude-sonnet-4-20250514 --temperature 0.3
# System prompt example
harvester message --model grok-4-0709 --system "You are an expert programmer"
```
### Batch Processing
```bash
# Process CSV with AI
harvester batch questions.csv --model gemini-2.5-pro --template analysis
# Directory transformation
harvester process ./legacy_code --template modernize --model claude-sonnet-4-20250514
```
### Image Generation
```bash
# DALL-E 3
harvester image "A futuristic city" --provider dalle3 --quality hd
# Imagen 4
harvester image "Abstract art" --provider vertex_image --model imagen-4
```
## 🤝 Support & Contributing
- **Documentation**: Full guides in `/docs`
- **Issues**: Report bugs via GitHub issues
- **Enterprise**: Contact info@quantumencoding.io
- **License**: Commercial - see LICENSE file
## 🌟 Why Harvester SDK?
1. **Unified Interface** - One API for all providers
2. **Authentication Clarity** - Clear separation of auth methods
3. **Production Ready** - Error handling, retries, rate limiting
4. **Flexible Deployment** - CLI tools + Python SDK
5. **Cost Optimization** - Batch processing with 50% savings
6. **Multi-Modal** - Text, images, and more
7. **Enterprise Grade** - Licensed, supported, documented
---
**© 2025 QUANTUM ENCODING LTD**
📧 Contact: info@quantumencoding.io
🌐 Website: https://quantumencoding.io
*The complete AI processing platform for modern applications.*
## Raw data
{
"_id": null,
"home_page": "https://github.com/quantum-encoding/harvester-sdk",
"name": "harvester-sdk",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "\"Rich (The Architect)\" <rich@quantumencoding.io>",
"keywords": "ai, parallel-processing, batch-processing, openai, anthropic, google-ai, vertex-ai, genai, gemini, xai, grok, deepseek, multi-provider, async, sdk, harvesting-engine, automation, turn-based-chat, non-streaming",
"author": "Quantum Encoding",
"author_email": "Quantum Encoding Ltd <rich@quantumencoding.io>",
"download_url": "https://files.pythonhosted.org/packages/b5/2a/4682f148cd1888fa225265d835df3afa3a06ed5e474e97bf2d44d2b025bb/harvester_sdk-1.0.3.tar.gz",
"platform": null,
"description": "# \ud83d\ude80 Harvester SDK - Complete AI Processing Platform\n\n> **\"The unified interface for all AI providers with enterprise-grade reliability.\"**\n\n[](LICENSE)\n[](https://www.python.org/downloads/)\n[](#providers)\n\n## \ud83c\udf1f What is Harvester SDK?\n\nHarvester SDK is a comprehensive AI processing platform that provides a **unified interface** to all major AI providers. Whether you need text generation, image creation, batch processing, or real-time conversations, Harvester SDK handles the complexity so you can focus on building.\n\n### \u26a1 Key Features\n\n- **Multi-Provider Support** - OpenAI, Anthropic, Google AI Studio, Vertex AI, XAI, DeepSeek\n- **Dual Authentication** - API keys (GenAI) and service accounts (Vertex AI)\n- **Turn-Based Chat** - Non-streaming conversations with any model\n- **Batch Processing** - Cost-effective bulk operations with 50% savings\n- **Template System** - AI-powered code transformation\n- **Image Generation** - DALL-E, Imagen, GPT Image support\n- **Enterprise Ready** - Rate limiting, retries, error handling\n\n## \ud83d\ude80 Quick Start\n\n### Installation\n\n```bash\n# Install the SDK\npip install harvester-sdk\n\n# Install with all providers\npip install harvester-sdk[all]\n\n# Install specific providers\npip install harvester-sdk[openai,anthropic,genai]\n```\n\n### Basic Usage\n\n```bash\n# Main CLI conductor\nharvester --help\n\n# Turn-based conversation (non-streaming)\nharvester message --model gemini-2.5-flash\nharvester message --model claude-sonnet-4-20250514 --system \"You are a helpful assistant\"\n\n# Batch processing from CSV\nharvester batch data.csv --model gpt-4o --template quick\n\n# Process directory with templates\nharvester process ./src --template refactor --model gemini-2.5-pro\n\n# Generate images\nharvester image \"A beautiful sunset\" --provider dalle3 --size 1024x1024\n```\n\n## \ud83d\udd27 Provider Configuration\n\n### Google AI Studio (GenAI) - API Key Authentication\n```bash\nexport GEMINI_API_KEY=your_api_key\nharvester message --model gemini-2.5-flash\n```\n\n### Google Vertex AI - Service Account Authentication\n```bash\nexport GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json\nharvester message --model vtx-gemini-2.5-flash\n```\n\n### Other Providers\n```bash\nexport OPENAI_API_KEY=your_openai_key\nexport ANTHROPIC_API_KEY=your_anthropic_key\nexport XAI_API_KEY=your_xai_key\nexport DEEPSEEK_API_KEY=your_deepseek_key\n```\n\n## \ud83d\udccb Available Commands\n\n### Core Commands\n- `harvester message` - Turn-based conversations (non-streaming)\n- `harvester chat` - Interactive streaming chat\n- `harvester batch` - Batch process CSV files\n- `harvester process` - Directory processing with templates\n- `harvester image` - Image generation\n- `harvester search` - AI-enhanced web search (Grok)\n\n### Utility Commands\n- `harvester list-models` - Show available models\n- `harvester config --show` - Display configuration\n- `harvester tier` - Show license information\n- `harvester status` - Check batch job status\n\n## \ud83c\udfaf Model Selection Guide\n\n### Google AI Models\n\n| **API Key (GenAI)** | **Service Account (Vertex)** | **Use Case** |\n|---------------------|------------------------------|--------------|\n| `gemini-2.5-flash` | `vtx-gemini-2.5-flash` | Fast, cost-effective |\n| `gemini-2.5-pro` | `vtx-gemini-2.5-pro` | High-quality reasoning |\n| `gemini-1.5-flash` | `vtx-gemini-1.5-flash` | Legacy support |\n\n### Other Providers\n- **OpenAI**: `gpt-4o`, 
`gpt-4o-mini`\n- **Anthropic**: `claude-sonnet-4-20250514`, `claude-opus-4-1-20250805`\n- **XAI**: `grok-4-0709`, `grok-3`, `grok-3-mini`\n- **DeepSeek**: `deepseek-chat`, `deepseek-reasoner`\n\n## \ud83d\udcbc Programming Interface\n\n### Python SDK Usage\n\n```python\nfrom harvester_sdk import HarvesterSDK\n\n# Initialize SDK\nsdk = HarvesterSDK()\n\n# Quick processing\nresult = await sdk.quick_process(\n prompt=\"Explain quantum computing\",\n model=\"gemini-2.5-pro\"\n)\n\n# Batch processing\nresults = await sdk.process_batch(\n requests=[\"What is AI?\", \"Explain ML\", \"Define neural networks\"],\n model=\"claude-sonnet-4-20250514\"\n)\n\n# Multi-provider council (get consensus)\nconsensus = await sdk.quick_council(\n prompt=\"What is consciousness?\",\n models=[\"gemini-2.5-pro\", \"claude-sonnet-4-20250514\", \"gpt-4o\"]\n)\n```\n\n### Provider Factory\n\n```python\nfrom providers.provider_factory import ProviderFactory\n\n# Create provider factory\nfactory = ProviderFactory()\n\n# Get provider for specific model\nprovider = factory.get_provider(\"gemini-2.5-flash\") # -> GenAI provider\nprovider = factory.get_provider(\"vtx-gemini-2.5-flash\") # -> Vertex AI provider\n\n# Generate completion\nresponse = await provider.complete(\"Hello, world!\", \"gemini-2.5-flash\")\n```\n\n## \ud83c\udfd7\ufe0f Architecture\n\n```\n\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\n\u2502 HARVESTER SDK \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 Main CLI Conductor \u2502\n\u2502 (harvester command) \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 Message \u2502 Batch \u2502 Process \u2502 Image \u2502 Search \u2502\n\u2502(Non-str) \u2502 CSV \u2502 Dir \u2502 Gen \u2502 Enhanced \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 Provider Factory \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 GenAI \u2502 Vertex \u2502 OpenAI \u2502Anthropic \u2502 XAI \u2502\n\u2502(API Key) \u2502(Service) \u2502 \u2502 \u2502 DeepSeek 
\u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\n```\n\n## \ud83d\udd12 Authentication Methods\n\n### Clear Separation for Google Services\n\n**Google AI Studio (GenAI)**:\n- \u2705 Simple API key: `GEMINI_API_KEY`\n- \u2705 Models: `gemini-2.5-flash`, `gemini-2.5-pro`\n- \u2705 Best for: Personal use, quick setup\n\n**Google Vertex AI**:\n- \u2705 Service account: `GOOGLE_APPLICATION_CREDENTIALS`\n- \u2705 Models: `vtx-gemini-2.5-flash`, `vtx-gemini-2.5-pro`\n- \u2705 Best for: Enterprise, GCP integration\n\n## \ud83d\udcb0 License Tiers\n\n| Tier | Features | Max Workers | Providers |\n|------|----------|-------------|-----------|\n| **Freemium** | Basic tools | 5 | Limited |\n| **Professional** | Full access | 20 | All |\n| **Premium** | Advanced features | 75 | All + Priority |\n| **Enterprise** | Unlimited | \u221e | All + SLA |\n\nCheck your tier: `harvester tier`\n\n## \ud83d\udcd6 Examples\n\n### Turn-Based Conversation\n\n```bash\n# Start a conversation with Gemini\nharvester message --model gemini-2.5-flash --save\n\n# Chat with Claude\nharvester message --model claude-sonnet-4-20250514 --temperature 0.3\n\n# System prompt example\nharvester message --model grok-4-0709 --system \"You are an expert programmer\"\n```\n\n### Batch Processing\n\n```bash\n# Process CSV with AI\nharvester batch questions.csv --model gemini-2.5-pro --template analysis\n\n# Directory transformation\nharvester process ./legacy_code --template modernize --model claude-sonnet-4-20250514\n```\n\n### Image Generation\n\n```bash\n# DALL-E 3\nharvester image \"A futuristic city\" --provider dalle3 --quality hd\n\n# Imagen 4\nharvester image \"Abstract art\" --provider vertex_image --model imagen-4\n```\n\n## \ud83e\udd1d Support & Contributing\n\n- **Documentation**: Full guides in `/docs`\n- **Issues**: Report bugs via GitHub issues\n- **Enterprise**: Contact info@quantumencoding.io\n- **License**: Commercial - see LICENSE file\n\n## \ud83c\udf1f Why Harvester SDK?\n\n1. **Unified Interface** - One API for all providers\n2. **Authentication Clarity** - Clear separation of auth methods\n3. **Production Ready** - Error handling, retries, rate limiting\n4. **Flexible Deployment** - CLI tools + Python SDK\n5. **Cost Optimization** - Batch processing with 50% savings\n6. **Multi-Modal** - Text, images, and more\n7. **Enterprise Grade** - Licensed, supported, documented\n\n---\n\n**\u00a9 2025 QUANTUM ENCODING LTD** \n\ud83d\udce7 Contact: info@quantumencoding.io \n\ud83c\udf10 Website: https://quantumencoding.io \n\n*The complete AI processing platform for modern applications.*\n",
"bugtrack_url": null,
"license": "LicenseRef-Proprietary",
"summary": "Harvester SDK - The Complete AI Processing Platform. Unified interface for all AI processing paradigms with enterprise-grade reliability.",
"version": "1.0.3",
"project_urls": {
"Commercial Licensing": "https://quantumencoding.io/harvester-sdk",
"Documentation": "https://harvester-sdk.readthedocs.io",
"Homepage": "https://harvester-sdk.quantumencoding.io",
"Issues": "https://github.com/quantum-encoding/harvester-sdk/issues",
"Repository": "https://github.com/quantum-encoding/harvester-sdk"
},
"split_keywords": [
"ai",
" parallel-processing",
" batch-processing",
" openai",
" anthropic",
" google-ai",
" vertex-ai",
" genai",
" gemini",
" xai",
" grok",
" deepseek",
" multi-provider",
" async",
" sdk",
" harvesting-engine",
" automation",
" turn-based-chat",
" non-streaming"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "652ef9bf4aa2d0f52e85c5c8e3c11983b43b3293e8b34e987a15d6862a679227",
"md5": "854fda18ab5bf7b7926c6b12bfde3929",
"sha256": "87674478f767e083c4a01545ee0177ab4fea41746fdd7ad7d7e19974c59e960c"
},
"downloads": -1,
"filename": "harvester_sdk-1.0.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "854fda18ab5bf7b7926c6b12bfde3929",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 282224,
"upload_time": "2025-09-06T12:21:50",
"upload_time_iso_8601": "2025-09-06T12:21:50.804577Z",
"url": "https://files.pythonhosted.org/packages/65/2e/f9bf4aa2d0f52e85c5c8e3c11983b43b3293e8b34e987a15d6862a679227/harvester_sdk-1.0.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "b52a4682f148cd1888fa225265d835df3afa3a06ed5e474e97bf2d44d2b025bb",
"md5": "5b56a39ef9d246808896509e1ab8b8a1",
"sha256": "ab6426e664cc26fe88a7a758e05eda03e413830cc9c5641650dd70fb0e8b127f"
},
"downloads": -1,
"filename": "harvester_sdk-1.0.3.tar.gz",
"has_sig": false,
"md5_digest": "5b56a39ef9d246808896509e1ab8b8a1",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 375915,
"upload_time": "2025-09-06T12:21:53",
"upload_time_iso_8601": "2025-09-06T12:21:53.557242Z",
"url": "https://files.pythonhosted.org/packages/b5/2a/4682f148cd1888fa225265d835df3afa3a06ed5e474e97bf2d44d2b025bb/harvester_sdk-1.0.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-06 12:21:53",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "quantum-encoding",
"github_project": "harvester-sdk",
"github_not_found": true,
"lcname": "harvester-sdk"
}