Name | codedharmony-ai-models |
Version | 0.1.3 |
home_page | None |
Summary | A package to manage connections to AI model services such as OpenAI and Azure OpenAI. |
upload_time | 2025-02-03 11:10:12 |
maintainer | None |
docs_url | None |
author | None |
requires_python | None |
license | MIT License
Copyright (c) 2025 Online Immigrant Services Ltd
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
|
keywords |
ai
openai
azure
models
sdk
|
VCS |
|
bugtrack_url |
|
requirements |
No requirements were recorded.
|
Travis-CI |
No Travis.
|
coveralls test coverage |
No coveralls.
|
# codedharmony-ai-models
**codedharmony-ai-models** is a Python package that provides a unified interface for connecting to and interacting with various AI model services—such as OpenAI and Azure OpenAI—using prebuilt functions. By abstracting the connection and interaction logic into reusable functions, this package makes it easy to integrate advanced AI capabilities into your projects while minimizing boilerplate code and simplifying maintenance.
## Overview
Instead of writing repetitive code to initialize API clients, manage credentials, and handle API responses for each service separately, **codedharmony-ai-models** offers dedicated functions that:
- Initialize and configure clients for both OpenAI and Azure OpenAI.
- Load sensitive configuration from environment variables (using a `.env` file).
- Send chat or completion requests with consistent parameters.
- Return parsed responses for immediate use in your application.
Centralizing these responsibilities leads to:
- **Convenience:** Call a single function (e.g., `create_chat_completion()`) without worrying about setting up clients or managing credentials repeatedly.
- **Consistency:** A uniform interface to both OpenAI and Azure OpenAI minimizes the learning curve.
- **Maintainability:** Updates (such as API version changes) need only be made in one place.
- **Security:** Using a `.env` file (and the `python-dotenv` package) allows you to keep sensitive data out of your source code.
## Installation
Install the package from PyPI using pip:
```bash
pip install codedharmony-ai-models
```
## Environment Setup
To keep your API keys and endpoints secure, create a `.env` file in the root of your project with the required variables. For example:
```env
# .env file
# For OpenAI API (if used)
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=your_model_name
# For Azure OpenAI Service
AZURE_OPENAI_API_KEY=your_azure_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com
DEPLOYMENT_NAME=your_deployment_name_here
API_VERSION=your_api_version_here
```
**Important:**
- Add the `.env` file to your `.gitignore` to prevent accidentally committing sensitive information.
- The package automatically loads these variables by calling `load_dotenv()` (from the `python-dotenv` package) during initialization. Ensure that you have installed this dependency:
```bash
pip install python-dotenv
```
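Once `load_dotenv()` has populated the process environment, required settings can be read with a small guard so a missing key fails fast with a clear message. The helper below is an illustrative sketch, not part of the package:

```python
import os

def get_required_env(name):
    # Illustrative helper (not part of codedharmony-ai-models): fetch a
    # required setting after load_dotenv() has populated os.environ,
    # raising a clear error if the variable was never set.
    value = os.getenv(name)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Failing at startup with the variable's name is usually easier to debug than an authentication error deep inside an API call.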
## Usage Examples
### Using the OpenAI Connector
The **OpenAI connector** provides functions to set up and call the OpenAI API easily. For example:
```python
# File: example_openai.py
from codedharmony_ai_models.openai_connector import create_chat_completion
# Read a prompt from the user
prompt = input("Ask a question for OpenAI: ")
# Define a system message for context
system_message = "You are a knowledgeable assistant specializing in artificial intelligence."
# Create a chat completion using the OpenAI API
response_text = create_chat_completion(
prompt,
system_message=system_message,
max_tokens=200, # Adjust maximum tokens as needed
temperature=0.8, # Control the creativity of the response
top_p=1.0,
frequency_penalty=0.0,
presence_penalty=0.0,
stop=None # Optionally set stop sequences
)
print("OpenAI response:", response_text)
```
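Internally, a connector like `create_chat_completion()` has to translate the prompt and system message into the messages list that chat-style APIs expect. The helper below is a hypothetical sketch of that step (the package's actual internals may differ):

```python
def build_messages(prompt, system_message=None):
    # Hypothetical sketch of the message-assembly step inside a connector:
    # chat-style APIs take a list of {"role", "content"} dicts, with an
    # optional system message first and the user's prompt after it.
    messages = []
    if system_message:
        messages.append({"role": "system", "content": system_message})
    messages.append({"role": "user", "content": prompt})
    return messages
```

The connector would then pass this list, together with parameters like `max_tokens` and `temperature`, to the underlying client's chat-completion call.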
### Using the Azure OpenAI Connector
The **Azure OpenAI connector** works similarly but is tailored for Azure’s deployment model. Ensure you have set the environment variables `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` before running:
```python
# File: example_azure.py
from codedharmony_ai_models.azure_connector import create_azure_chat_completion
# Read a prompt from the user
prompt = input("Ask a question for Azure OpenAI: ")
# Create a chat completion using Azure OpenAI Service
response_text = create_azure_chat_completion(
prompt,
max_tokens=200, # Customize as needed
temperature=0.8, # Adjust for more or less randomness
top_p=1.0,
frequency_penalty=0.0,
presence_penalty=0.0,
stop=None # Optionally specify stop sequences
)
print("Azure OpenAI response:", response_text)
```
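The Azure path needs a few extra settings (endpoint, deployment, API version) alongside the key. One way to gather them, using the variable names from the `.env` example above, is sketched below; the helper itself is illustrative and not part of the package:

```python
import os

def azure_settings():
    # Illustrative helper: collect the Azure OpenAI settings defined in the
    # .env example above into one dict. A KeyError here means a variable
    # was not set before the connector ran.
    return {
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        "azure_deployment": os.environ["DEPLOYMENT_NAME"],
        "api_version": os.environ["API_VERSION"],
    }
```

Collecting the settings in one place makes it easy to validate them together before any request is sent.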
## Why Use This Package?
Using **codedharmony-ai-models** offers several key advantages over manually integrating AI APIs:
### Convenience:
The package centralizes the connection logic so that you simply call a function (e.g., `create_chat_completion()`) without having to set up API clients and manage credentials throughout your code.
### Consistency:
Both OpenAI and Azure OpenAI are accessed through a similar interface. This uniformity reduces the learning curve and minimizes the chance for errors when switching between services.
### Maintainability:
Updates to API versions or connection details only need to be made in one place (inside the connector modules), rather than scattered throughout your project. This decouples your business logic from low-level API details.
### Flexibility:
The abstraction allows you to easily switch endpoints or even add support for additional AI model providers in the future without refactoring your entire codebase.
### Simplified Error Handling:
By encapsulating API calls within dedicated functions, you can standardize error handling and logging, making it easier to debug issues compared to handling raw HTTP responses.
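For instance, a connector can funnel every API call through one wrapper that logs failures and retries transient errors. The sketch below illustrates the pattern; it is not the package's actual implementation:

```python
import logging
import time

def call_with_retries(fn, *args, attempts=3, delay=0.0, **kwargs):
    # Illustrative pattern: centralize error handling and logging around
    # any connector call, retrying up to `attempts` times before re-raising.
    for attempt in range(1, attempts + 1):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            logging.warning("API call failed (attempt %d/%d): %s",
                            attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay)
```

Because every call goes through the same wrapper, logging format and retry policy can be changed in one place rather than at each call site.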
Raw data
{
"_id": null,
"home_page": null,
"name": "codedharmony-ai-models",
"maintainer": null,
"docs_url": null,
"requires_python": null,
"maintainer_email": null,
"keywords": "ai, openai, azure, models, sdk",
"author": null,
"author_email": "Andrew Vialichka <andreivialichka@onlineimmigrant.com>",
"download_url": "https://files.pythonhosted.org/packages/25/77/424213acb2af6c80cc55380b7941d4bb321dd95dcfef61d3f14ce80a1e9f/codedharmony_ai_models-0.1.3.tar.gz",
"platform": null,
"description": "# codedharmony-ai-models\n\n**codedharmony-ai-models** is a Python package that provides a unified interface for connecting to and interacting with various AI model services\u2014such as OpenAI and Azure OpenAI\u2014using prebuilt functions. By abstracting the connection and interaction logic into reusable functions, this package makes it easy to integrate advanced AI capabilities into your projects while minimizing boilerplate code and simplifying maintenance.\n\n## Overview\n\nInstead of writing repetitive code to initialize API clients, manage credentials, and handle API responses for each service separately, **codedharmony-ai-models** offers dedicated functions that:\n- Initialize and configure clients for both OpenAI and Azure OpenAI.\n- Load sensitive configuration from environment variables (using a `.env` file).\n- Send chat or completion requests with consistent parameters.\n- Return parsed responses for immediate use in your application.\n\nCentralizing these responsibilities leads to:\n- **Convenience:** Call a single function (e.g., `create_chat_completion()`) without worrying about setting up clients or managing credentials repeatedly.\n- **Consistency:** A uniform interface to both OpenAI and Azure OpenAI minimizes the learning curve.\n- **Maintainability:** Updates (such as API version changes) need only be made in one place.\n- **Security:** Using a `.env` file (and the `python-dotenv` package) allows you to keep sensitive data out of your source code.\n\n\n## Installation\n\nInstall the package using pip (once published to PyPI):\n\n```bash\npip install codedharmony-ai-models\n```\n\n\n\n\n## Environment Setup\n\nTo keep your API keys and endpoints secure, create a `.env` file in the root of your project with the required variables. 
For example:\n\n```env\n# .env file\n# For OpenAI API (if used)\nOPENAI_API_KEY=your_openai_api_key_here\nOPENAI_MODEL=your_model_name\n\n# For Azure OpenAI Service\nAZURE_OPENAI_API_KEY=your_azure_api_key_here\nAZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com\nDEPLOYMENT_NAME=your_deployment_name_here\nAPI_VERSION=api_version\n```\n\n\n**Important:**\n\n- Add the .env file to your .gitignore to prevent accidentally committing sensitive information.\n- The package automatically loads these variables by calling load_dotenv() (from the python-dotenv package) during initialization. Ensure that you have installed this dependency:\n\n```bash\npip install python-dotenv\n```\n\n\n\n\n## Usage Examples\n\n### Using the OpenAI Connector\n\nThe **OpenAI connector** provides functions to set up and call the OpenAI API easily. For example:\n\n```python\n# File: example_openai.py\nfrom codedharmony_ai_models.openai_connector import create_chat_completion\n\n# Read a prompt from the user\nprompt = input(\"Ask question for OpenAI: \")\n\n# Define a system message for context\nsystem_message = \"You are a knowledgeable assistant specializing in artificial intelligence.\"\n\n# Create a chat completion using the OpenAI API\nresponse_text = create_chat_completion(\n prompt,\n system_message=system_message,\n max_tokens=200, # Adjust maximum tokens as needed\n temperature=0.8, # Control the creativity of the response\n top_p=1.0,\n frequency_penalty=0.0,\n presence_penalty=0.0,\n stop=None # Optionally set stop sequences\n)\n\nprint(\"OpenAI response:\", response_text)\n\n\n```\n\n### Using the Azure OpenAI Connector\n\nThe **Azure OpenAI connector** works similarly but is tailored for Azure\u2019s deployment model. 
Ensure you have set the environment variables AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT before running:\n\n```python\n# File: example_azure.py\nfrom codedharmony_ai_models.azure_connector import create_azure_chat_completion\n\n# Read a prompt from the user\nprompt = input(\"Ask question for Azure OpenAI: \")\n\n\n# Create a chat completion using Azure OpenAI Service\nresponse_text = create_azure_chat_completion(\n prompt,\n max_tokens=200, # Customize as needed\n temperature=0.8, # Adjust for more or less randomness\n top_p=1.0,\n frequency_penalty=0.0,\n presence_penalty=0.0,\n stop=None # Optionally specify stop sequences\n)\n\nprint(\"Azure OpenAI response:\", response_text)\n\n\n```\n\n\n## Why Use This Package?\n\nUsing codedharmony-ai-models offers several key advantages over manually integrating AI APIs:\n\n### Convenience:\nThe package centralizes the connection logic so that you simply call a function (e.g., create_chat_completion()) without having to set up API clients and manage credentials throughout your code.\n\n### Consistency:\nBoth OpenAI and Azure OpenAI are accessed through a similar interface. This uniformity reduces the learning curve and minimizes the chance for errors when switching between services.\n\n### Maintainability:\nUpdates to API versions or connection details only need to be made in one place (inside the connector modules), rather than scattered throughout your project. This decouples your business logic from low-level API details.\n\n### Flexibility:\nThe abstraction allows you to easily switch endpoints or even add support for additional AI model providers in the future without refactoring your entire codebase.\n\n### Simplified Error Handling:\nBy encapsulating API calls within dedicated functions, you can standardize error handling and logging, making it easier to debug issues compared to handling raw HTTP responses.\n",
"bugtrack_url": null,
"license": "MIT License\n \n Copyright (c) 2025 Online Immigrant Services Ltd\n \n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n \n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n \n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE.\n ",
"summary": "A package to manage connections to AI model services such as OpenAI and Azure OpenAI.",
"version": "0.1.3",
"project_urls": null,
"split_keywords": [
"ai",
" openai",
" azure",
" models",
" sdk"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "f476e53113a2b087670c1d3af534d8a3957b951195ac658516720291d81417d5",
"md5": "9bba359c5e1ce06c9166d033c788f5ee",
"sha256": "92840f3f25847d7c893426b75656917b7fb5f216d51b687929450ed1bbf989de"
},
"downloads": -1,
"filename": "codedharmony_ai_models-0.1.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "9bba359c5e1ce06c9166d033c788f5ee",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 6786,
"upload_time": "2025-02-03T11:10:09",
"upload_time_iso_8601": "2025-02-03T11:10:09.157519Z",
"url": "https://files.pythonhosted.org/packages/f4/76/e53113a2b087670c1d3af534d8a3957b951195ac658516720291d81417d5/codedharmony_ai_models-0.1.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "2577424213acb2af6c80cc55380b7941d4bb321dd95dcfef61d3f14ce80a1e9f",
"md5": "c3fc7bd21d8ed9cb2166334a9183f7fa",
"sha256": "014d8d01e9e4c98ff3a90a4ae16d8dc6690987af13a53930bf1375bbe30f6f7c"
},
"downloads": -1,
"filename": "codedharmony_ai_models-0.1.3.tar.gz",
"has_sig": false,
"md5_digest": "c3fc7bd21d8ed9cb2166334a9183f7fa",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 5311,
"upload_time": "2025-02-03T11:10:12",
"upload_time_iso_8601": "2025-02-03T11:10:12.064877Z",
"url": "https://files.pythonhosted.org/packages/25/77/424213acb2af6c80cc55380b7941d4bb321dd95dcfef61d3f14ce80a1e9f/codedharmony_ai_models-0.1.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-03 11:10:12",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "codedharmony-ai-models"
}