dhenara-ai 1.0.2

- **Summary**: Dhenara Package for Multi Provider AI-Model API calls
- **Home page**: https://github.com/dhenara/dhenara-ai
- **Author**: Dhenara
- **License**: MIT
- **Requires Python**: >=3.10
- **Keywords**: ai, llm, machine learning, language models
- **Uploaded**: 2025-07-23 08:11:50
# Dhenara

Dhenara is a genuinely open source Python package that provides a unified interface for working with AI models from multiple providers. It is a lightweight, straightforward framework for integrating those models into Python applications, similar in spirit to LangChain but focused on simplicity, minimal dependencies, and type safety through Pydantic models.

For full documentation, visit [docs.dhenara.com](https://docs.dhenara.com/).
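
## Installation

To get started, install the package from PyPI (Python 3.10 or later is required):

```bash
pip install dhenara-ai
```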

## Why Dhenara?

- **Genuinely Open Source**: Built from the ground up as a community resource, not an afterthought or internal tool
- **Unified API**: Interact with different AI providers through a consistent interface
- **Type Safety**: Built with Pydantic for robust type checking and validation
- **Easy Regeneration across Providers**: With unified Pydantic outputs and built-in prompt formatting, the output of one model can be sent to any other model with minimal effort
- **Streaming**: First-class support for streaming responses, including an accumulated response that mirrors the non-streaming response format
- **Async Support**: Both synchronous and asynchronous interfaces for maximum flexibility
- **Resource Management**: Automatic handling of connections, retries, and timeouts
- **Foundation Models**: Pre-configured models with sensible defaults
- **Test Mode**: Bring up your app with dummy responses for both streaming and non-streaming generation
- **Cost/Usage Data**: Cost and usage data are derived and returned alongside responses, with an optional per-endpoint charge for commercial deployments
- **Community-Oriented Design**: An architecture separating API credentials, models, and configurations for flexible deployment and scaling

## Example Usage

Here's a simple example of using Dhenara to interact with an AI model. You can find more examples at [docs.dhenara.com](https://docs.dhenara.com/).

```python
from dhenara.ai import AIModelClient
from dhenara.ai.types import AIModelCallConfig, AIModelEndpoint
from dhenara.ai.types.external_api import AIModelAPIProviderEnum
from dhenara.ai.types.genai import AIModelAPI
from dhenara.ai.types.genai.foundation_models.anthropic.chat import Claude37Sonnet

# Create an API
api = AIModelAPI(
    provider=AIModelAPIProviderEnum.ANTHROPIC,
    api_key="your_api_key",
)

# Create an endpoint using a pre-configured model
model_endpoint = AIModelEndpoint(
    api=api,
    ai_model=Claude37Sonnet,
)

# Configure the API call
config = AIModelCallConfig(
    max_output_tokens=16000,
    reasoning=True,  # Thinking/reasoning mode
    max_reasoning_tokens=8000,
    streaming=False,
)

# Create the client
client = AIModelClient(
    model_endpoint=model_endpoint,
    config=config,
    is_async=False,
)

# Create a prompt
prompt = {
    "role": "user",
    "content": "Explain quantum computing in simple terms",
}

# Generate a response
response = client.generate(prompt=prompt)

# If not streaming
if response.chat_response:
    print(response.chat_response.choices[0].contents[0].get_text())

# If streaming
elif response.stream_generator:
    for chunk, _ in response.stream_generator:
        if chunk:
            print(
                chunk.data.choice_deltas[0].content_deltas[0].get_text_delta(),
                end="",
                flush=True,
            )
```
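
Streaming works with the same client setup. Below is a minimal sketch, reusing the `api` and `model_endpoint` objects from the example above with `streaming=True`; the chunk handling mirrors the `stream_generator` branch shown earlier.

```python
# A minimal streaming sketch, assuming the same `model_endpoint` created above.
streaming_config = AIModelCallConfig(
    max_output_tokens=16000,
    streaming=True,
)

streaming_client = AIModelClient(
    model_endpoint=model_endpoint,
    config=streaming_config,
    is_async=False,
)

response = streaming_client.generate(
    prompt={"role": "user", "content": "Explain quantum computing in simple terms"},
)

# With streaming enabled, the response exposes a generator of chunks.
if response.stream_generator:
    for chunk, _ in response.stream_generator:
        if chunk:
            print(
                chunk.data.choice_deltas[0].content_deltas[0].get_text_delta(),
                end="",
                flush=True,
            )
```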

## Documentation

For full documentation, visit [docs.dhenara.com](https://docs.dhenara.com/).

            
