# Freddy Python SDK
[PyPI](https://badge.fury.io/py/aitronos-freddy) · [Python 3.11+](https://www.python.org/downloads/) · [MIT License](https://opensource.org/licenses/MIT)
Official Python SDK for the [Freddy AI Assistant API](https://freddy-api.aitronos.com).
## Features
- **AI Responses**: Generate text, structured outputs, and tool-augmented responses
- **File Management**: Upload, list, retrieve, and delete files
- **Vector Stores**: Create and manage knowledge bases for RAG
- **Image Generation**: Generate, upscale, and manipulate images with DALL-E and ClipDrop
- **Sync & Async**: Full support for both synchronous and asynchronous operations
- **Type Hints**: Comprehensive type annotations for better IDE support
- **Production Ready**: Error handling, timeouts, and context managers
## Installation
```bash
pip install aitronos-freddy
```
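To confirm the install worked and check which version you have, you can query pip's package metadata with the standard library; nothing Freddy-specific is assumed here:

```python
from importlib.metadata import version

# Reads the installed distribution's version straight from pip metadata.
print(version("aitronos-freddy"))
```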
## Quick Start
```python
import os
from freddy import FreddyClient
# Initialize client
client = FreddyClient(api_key=os.getenv("FREDDY_API_KEY"))
# Create AI response
response = client.responses.create(
model="gpt-4.1",
inputs=[
{"role": "user", "texts": [{"text": "Hello, Freddy!"}]}
],
organization_id="your-org-id"
)
print(response.output[0].content[0].text)
client.close()
```
### Using Context Managers (Recommended)
```python
with FreddyClient(api_key="your-api-key") as client:
    response = client.responses.create(
        model="gpt-4.1",
        inputs=[{"role": "user", "texts": [{"text": "Hello!"}]}]
    )
    print(response.output[0].content[0].text)
```
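If you prefer to manage the client's lifetime yourself, the same cleanup can be expressed with `try`/`finally`. This sketch uses only the constructor and `close()` already shown in the Quick Start:

```python
import os
from freddy import FreddyClient

client = FreddyClient(api_key=os.getenv("FREDDY_API_KEY"))
try:
    response = client.responses.create(
        model="gpt-4.1",
        inputs=[{"role": "user", "texts": [{"text": "Hello!"}]}]
    )
    print(response.output[0].content[0].text)
finally:
    # Release the underlying HTTP resources even if the request fails.
    client.close()
```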
### Async Usage
```python
import asyncio
from freddy import AsyncFreddyClient

async def main():
    async with AsyncFreddyClient(api_key="your-api-key") as client:
        response = await client.responses.create(
            model="gpt-4.1",
            inputs=[{"role": "user", "texts": [{"text": "Hello!"}]}]
        )
        print(response.output[0].content[0].text)

asyncio.run(main())
```
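Because the async client does not block while waiting on the API, several prompts can be sent concurrently with `asyncio.gather`. This is a minimal sketch built only from the calls shown above; how many requests you can run in parallel is ultimately governed by your rate limits:

```python
import asyncio
from freddy import AsyncFreddyClient

async def ask(client, question):
    response = await client.responses.create(
        model="gpt-4.1",
        inputs=[{"role": "user", "texts": [{"text": question}]}]
    )
    return response.output[0].content[0].text

async def main():
    questions = ["What is Python?", "What is Rust?", "What is Go?"]
    async with AsyncFreddyClient(api_key="your-api-key") as client:
        # Send all requests at once and wait for every answer.
        answers = await asyncio.gather(*(ask(client, q) for q in questions))
    for question, answer in zip(questions, answers):
        print(f"{question} -> {answer}")

asyncio.run(main())
```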
## Core Features
### AI Responses
```python
# Basic text generation
response = client.responses.create(
model="gpt-4.1",
inputs=[{"role": "user", "texts": [{"text": "Explain quantum computing"}]}],
temperature=0.7,
max_tokens=500
)
# With conversation context
response = client.responses.create(
model="claude-3-5-sonnet",
inputs=[
{"role": "user", "texts": [{"text": "What is Python?"}]},
{"role": "assistant", "texts": [{"text": "Python is a programming language..."}]},
{"role": "user", "texts": [{"text": "Show me an example"}]}
],
thread_id="thread_abc123"
)
```
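A simple multi-turn chat can be built by appending each turn to the `inputs` list, exactly as in the conversation-context example above. The sketch below reuses only the message shape from this section and keeps the full history in memory; trimming old turns and error handling are left out for brevity:

```python
import os
from freddy import FreddyClient

client = FreddyClient(api_key=os.getenv("FREDDY_API_KEY"))
history = []

def chat(user_message):
    # Record the user's turn, send the whole history, then record the reply.
    history.append({"role": "user", "texts": [{"text": user_message}]})
    response = client.responses.create(model="gpt-4.1", inputs=history)
    reply = response.output[0].content[0].text
    history.append({"role": "assistant", "texts": [{"text": reply}]})
    return reply

print(chat("What is Python?"))
print(chat("Show me an example"))
client.close()
```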
### File Management
```python
# Upload file
file = client.files.upload(
organization_id="org_123",
file="document.pdf",
purpose="vector_store"
)
# List files
files = client.files.list(
organization_id="org_123",
page=1,
page_size=20
)
# Get file details
file = client.files.retrieve("org_123", "file_abc")
# Delete file
client.files.delete("org_123", "file_abc")
```
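Seeding a knowledge base usually means uploading a whole folder rather than a single file. The sketch below loops `files.upload` over a directory; it assumes the returned object exposes an `id` attribute, which this README does not spell out:

```python
import os
from pathlib import Path

from freddy import FreddyClient

with FreddyClient(api_key=os.getenv("FREDDY_API_KEY")) as client:
    uploaded_ids = []
    for path in Path("./docs").glob("*.pdf"):
        uploaded = client.files.upload(
            organization_id="org_123",
            file=str(path),
            purpose="vector_store"
        )
        # Assumes the upload response carries the new file's id.
        uploaded_ids.append(uploaded.id)
    print(f"Uploaded {len(uploaded_ids)} files")
```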
### Vector Stores
```python
# Create vector store
store = client.vector_stores.create(
organization_id="org_123",
name="Company Knowledge Base",
description="Internal documentation",
access_mode="organization"
)
# Add file to vector store
client.vector_stores.add_file("org_123", store.id, "file_abc")
# List vector stores
stores = client.vector_stores.list("org_123")
# Query with RAG
response = client.responses.create(
model="gpt-4.1",
inputs=[{"role": "user", "texts": [{"text": "What is our return policy?"}]}],
tools=[{
"type": "file_search",
"vectorStoreIds": [store.id]
}]
)
```
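Putting the pieces together, a typical RAG flow is: create a store, upload a document, attach it, then ask a question with the `file_search` tool. This sketch only combines calls already shown in this README and again assumes the upload response has an `id` attribute:

```python
import os
from freddy import FreddyClient

with FreddyClient(api_key=os.getenv("FREDDY_API_KEY")) as client:
    store = client.vector_stores.create(
        organization_id="org_123",
        name="Support Docs",
        access_mode="organization"
    )
    uploaded = client.files.upload(
        organization_id="org_123",
        file="return_policy.pdf",
        purpose="vector_store"
    )
    # Attach the uploaded file to the new store.
    client.vector_stores.add_file("org_123", store.id, uploaded.id)

    response = client.responses.create(
        model="gpt-4.1",
        inputs=[{"role": "user", "texts": [{"text": "Summarize our return policy."}]}],
        tools=[{"type": "file_search", "vectorStoreIds": [store.id]}]
    )
    print(response.output[0].content[0].text)
```

In practice the file may need a moment to finish indexing before the first query returns grounded results.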
### Image Generation
```python
# Generate images
result = client.images.generate(
organization_id="org_123",
prompt="A serene mountain landscape at sunset",
model="dall-e-3",
size="1024x1024"
)
# Save to disk
result.save_all("./images", prefix="mountain")
# Upscale image
upscaled = client.images.upscale(
organization_id="org_123",
image="photo.jpg",
target_width=2048,
target_height=2048
)
# Remove background
no_bg = client.images.remove_background(
organization_id="org_123",
image="photo.jpg"
)
# Replace background
new_bg = client.images.replace_background(
organization_id="org_123",
image="photo.jpg",
prompt="tropical beach with palm trees"
)
```
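To produce several images in one run, loop `images.generate` over a set of prompts and give each batch its own filename prefix. A minimal sketch using only `generate` and `save_all` as shown above:

```python
import os
from freddy import FreddyClient

prompts = {
    "sunset": "A serene mountain landscape at sunset",
    "harbor": "A quiet harbor at dawn in watercolor style"
}

with FreddyClient(api_key=os.getenv("FREDDY_API_KEY")) as client:
    for prefix, prompt in prompts.items():
        result = client.images.generate(
            organization_id="org_123",
            prompt=prompt,
            model="dall-e-3",
            size="1024x1024"
        )
        # Each batch is written to ./images under its own prefix.
        result.save_all("./images", prefix=prefix)
```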
## Configuration
### Custom Base URL
```python
client = FreddyClient(
api_key="your-key",
base_url="https://custom-api.example.com"
)
```
### Custom Timeout
```python
client = FreddyClient(
api_key="your-key",
timeout=60.0 # 60 seconds
)
```
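Both options can be combined, and reading them from the environment keeps credentials and endpoints out of source control. The `FREDDY_BASE_URL` and `FREDDY_TIMEOUT` variable names below are illustrative only; the SDK is not documented to read them automatically:

```python
import os
from freddy import FreddyClient

client = FreddyClient(
    api_key=os.environ["FREDDY_API_KEY"],
    # Fall back to the example endpoint and a 60-second timeout when unset.
    base_url=os.getenv("FREDDY_BASE_URL", "https://custom-api.example.com"),
    timeout=float(os.getenv("FREDDY_TIMEOUT", "60"))
)
```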
## Error Handling
```python
from freddy import FreddyClient
from freddy.exceptions import (
    AuthenticationError,
    RateLimitError,
    APIError
)

try:
    with FreddyClient(api_key="invalid-key") as client:
        response = client.responses.create(...)
except AuthenticationError as e:
    print(f"Authentication failed: {e.message}")
except RateLimitError as e:
    print(f"Rate limit exceeded: {e.message}")
except APIError as e:
    print(f"API error {e.status_code}: {e.message}")
```
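`RateLimitError` is usually transient, so a common pattern is to retry with exponential backoff. This is a minimal sketch built only on the exceptions above; tune the attempt count and delays for your workload:

```python
import time

from freddy import FreddyClient
from freddy.exceptions import RateLimitError

def create_with_retry(client, max_attempts=3, **kwargs):
    for attempt in range(1, max_attempts + 1):
        try:
            return client.responses.create(**kwargs)
        except RateLimitError:
            if attempt == max_attempts:
                raise
            # Back off 1 s, 2 s, 4 s, ... before the next attempt.
            time.sleep(2 ** (attempt - 1))

with FreddyClient(api_key="your-api-key") as client:
    response = create_with_retry(
        client,
        model="gpt-4.1",
        inputs=[{"role": "user", "texts": [{"text": "Hello!"}]}]
    )
    print(response.output[0].content[0].text)
```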
## Examples
See the [examples/](examples/) directory for more complete examples:
- [basic_usage.py](examples/basic_usage.py) - Simple AI response generation
- [file_upload.py](examples/file_upload.py) - File management operations
- [vector_stores.py](examples/vector_stores.py) - Knowledge base creation
- [image_generation.py](examples/image_generation.py) - Image operations
- [async_usage.py](examples/async_usage.py) - Async client usage
## Documentation
- [API Reference](https://freddy-api.aitronos.com/docs)
- [OpenAPI Specification](https://freddy-api.aitronos.com/openapi.json)
## Development
```bash
# Clone repository
git clone https://github.com/aitronos/freddy-python.git
cd freddy-python
# Install Poetry
curl -sSL https://install.python-poetry.org | python3 -
# Install dependencies
poetry install
# Run unit tests
poetry run pytest
# Run integration tests (requires API credentials)
# Option 1: Use .env file (recommended)
cp env.example .env
# Edit .env with your credentials
# Option 2: Export variables
export FREDDY_API_KEY=your-key
export FREDDY_ORG_ID=your-org-id
# Run tests
poetry run pytest tests/integration/ --integration --env=staging
# Format code
poetry run black freddy tests examples
poetry run ruff check freddy tests examples
```
See [TESTING.md](TESTING.md) for a comprehensive testing guide.
## Requirements
- Python 3.11+
- httpx >= 0.25.0
- pydantic >= 2.9.0
## License
MIT License - see [LICENSE](LICENSE) for details.
## Support
- Documentation: https://freddy-api.aitronos.com/docs
- Issues: https://github.com/aitronos/freddy-python/issues
- Email: phillip.loacker@aitronos.com
## Changelog
See [CHANGELOG.md](CHANGELOG.md) for version history and updates.